Mar 20 10:36:03 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 10:36:03 crc restorecon[4682]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:36:03 crc restorecon[4682]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 10:36:03 crc restorecon[4682]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc 
restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:03 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:03 crc restorecon[4682]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc 
restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc 
restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc 
restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 
crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:36:04 crc 
restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc 
restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:36:04 crc restorecon[4682]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 
crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc 
restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc 
restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:36:04 crc restorecon[4682]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:36:04 crc restorecon[4682]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 10:36:05 crc kubenswrapper[4748]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 10:36:05 crc kubenswrapper[4748]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 10:36:05 crc kubenswrapper[4748]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 10:36:05 crc kubenswrapper[4748]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 10:36:05 crc kubenswrapper[4748]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 10:36:05 crc kubenswrapper[4748]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.274864 4748 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281565 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281600 4748 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281613 4748 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281623 4748 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281633 4748 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281641 4748 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281650 4748 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281658 4748 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281667 4748 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281675 4748 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281682 4748 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281705 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281713 4748 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281722 4748 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281729 4748 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281738 4748 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281746 4748 feature_gate.go:330] 
unrecognized feature gate: PlatformOperators Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281754 4748 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281763 4748 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281773 4748 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281782 4748 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281791 4748 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281799 4748 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281807 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281815 4748 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281823 4748 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281831 4748 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281862 4748 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281870 4748 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281880 4748 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281889 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281897 4748 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281907 4748 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281916 4748 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281924 4748 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281932 4748 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281941 4748 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281949 4748 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281956 4748 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281964 4748 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281972 4748 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281979 4748 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281987 4748 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.281994 4748 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282002 4748 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282010 4748 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282018 4748 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282025 4748 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282033 4748 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282041 4748 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282049 4748 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282057 4748 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282064 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282072 4748 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282079 4748 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282087 4748 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282095 4748 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282102 4748 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282110 4748 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282117 4748 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282125 4748 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282133 4748 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282140 4748 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282148 4748 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282155 4748 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282164 4748 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282172 4748 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282180 4748 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282188 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282201 4748 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.282211 4748 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283180 4748 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283202 4748 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283216 4748 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283228 4748 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283239 4748 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283249 4748 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283261 4748 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283272 4748 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283281 4748 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283290 4748 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283300 4748 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283310 4748 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283319 4748 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283328 4748 flags.go:64] FLAG: --cgroup-root=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283337 4748 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283346 4748 flags.go:64] FLAG: --client-ca-file=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283356 4748 flags.go:64] FLAG: --cloud-config=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283365 4748 flags.go:64] FLAG: --cloud-provider=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283375 4748 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283387 4748 flags.go:64] FLAG: --cluster-domain=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283396 4748 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283405 4748 flags.go:64] FLAG: --config-dir=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283414 4748 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283424 4748 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283435 4748 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283445 4748 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283454 4748 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283465 4748 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283474 4748 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283482 4748 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283491 4748 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283500 4748 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283509 4748 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283520 4748 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283530 4748 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283539 4748 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283548 4748 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283557 4748 flags.go:64] FLAG: --enable-server="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283566 4748 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283576 4748 flags.go:64] FLAG: --event-burst="100"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283585 4748 flags.go:64] FLAG: --event-qps="50"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283594 4748 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283603 4748 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283612 4748 flags.go:64] FLAG: --eviction-hard=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283622 4748 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283631 4748 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283640 4748 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283650 4748 flags.go:64] FLAG: --eviction-soft=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283658 4748 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283667 4748 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283679 4748 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283688 4748 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283697 4748 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283706 4748 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283714 4748 flags.go:64] FLAG: --feature-gates=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283725 4748 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283734 4748 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283743 4748 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283753 4748 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283762 4748 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283771 4748 flags.go:64] FLAG: --help="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283779 4748 flags.go:64] FLAG: --hostname-override=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283788 4748 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283797 4748 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283806 4748 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283815 4748 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283823 4748 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283856 4748 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283866 4748 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283874 4748 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283884 4748 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283893 4748 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283902 4748 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283911 4748 flags.go:64] FLAG: --kube-reserved=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283921 4748 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283929 4748 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283939 4748 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283948 4748 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283958 4748 flags.go:64] FLAG: --lock-file=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283966 4748 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283975 4748 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283985 4748 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.283998 4748 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284007 4748 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284016 4748 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284025 4748 flags.go:64] FLAG: --logging-format="text"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284033 4748 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284042 4748 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284051 4748 flags.go:64] FLAG: --manifest-url=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284060 4748 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284071 4748 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284080 4748 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284091 4748 flags.go:64] FLAG: --max-pods="110"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284100 4748 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284109 4748 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284118 4748 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284127 4748 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284136 4748 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284145 4748 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284154 4748 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284175 4748 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284184 4748 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284193 4748 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284203 4748 flags.go:64] FLAG: --pod-cidr=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284211 4748 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284226 4748 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284243 4748 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284252 4748 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284261 4748 flags.go:64] FLAG: --port="10250"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284270 4748 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284279 4748 flags.go:64] FLAG: --provider-id=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284287 4748 flags.go:64] FLAG: --qos-reserved=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284296 4748 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284305 4748 flags.go:64] FLAG: --register-node="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284314 4748 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284323 4748 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284346 4748 flags.go:64] FLAG: --registry-burst="10"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284355 4748 flags.go:64] FLAG: --registry-qps="5"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284364 4748 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284373 4748 flags.go:64] FLAG: --reserved-memory=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284384 4748 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284393 4748 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284401 4748 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284411 4748 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284420 4748 flags.go:64] FLAG: --runonce="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284429 4748 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284438 4748 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284447 4748 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284456 4748 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284464 4748 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284475 4748 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284484 4748 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284493 4748 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284502 4748 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284510 4748 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284519 4748 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284528 4748 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284538 4748 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284547 4748 flags.go:64] FLAG: --system-cgroups=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284556 4748 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284571 4748 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284580 4748 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284590 4748 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284601 4748 flags.go:64] FLAG: --tls-min-version=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284610 4748 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284619 4748 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284628 4748 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284637 4748 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284646 4748 flags.go:64] FLAG: --v="2"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284657 4748 flags.go:64] FLAG: --version="false"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284668 4748 flags.go:64] FLAG: --vmodule=""
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284679 4748 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.284688 4748 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.284907 4748 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.284920 4748 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.284931 4748 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.284940 4748 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.284949 4748 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.284957 4748 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.284966 4748 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.284975 4748 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.284984 4748 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.284992 4748 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285000 4748 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285008 4748 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285015 4748 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285023 4748 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285031 4748 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285039 4748 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285047 4748 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285054 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285062 4748 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285073 4748 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285083 4748 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285091 4748 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285100 4748 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285109 4748 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285118 4748 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285127 4748 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285137 4748 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285146 4748 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285154 4748 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285163 4748 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285171 4748 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285189 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285197 4748 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285205 4748 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285213 4748 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285221 4748 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285228 4748 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285236 4748 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285244 4748 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285252 4748 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285260 4748 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285268 4748 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285275 4748 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285283 4748 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285291 4748 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285300 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285307 4748 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285315 4748 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285322 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285330 4748 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285339 4748 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285346 4748 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285353 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285361 4748 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285369 4748 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285379 4748 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285389 4748 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285398 4748 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285407 4748 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285419 4748 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285430 4748 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285442 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285453 4748 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285469 4748 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285480 4748 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285490 4748 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285499 4748 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285506 4748 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285514 4748 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285521 4748 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.285529 4748 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.286502 4748 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.297168 4748 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.297210 4748 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297275 4748 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297283 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297288 4748 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297293 4748 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297298 4748 feature_gate.go:330] unrecognized feature gate: Example Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297301 4748 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297305 4748 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297309 4748 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297313 4748 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297318 4748 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297322 4748 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297327 4748 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs 
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297331 4748 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297335 4748 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297339 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297343 4748 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297347 4748 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297351 4748 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297355 4748 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297361 4748 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297367 4748 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297371 4748 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297375 4748 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297379 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297383 4748 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297386 4748 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297390 4748 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297394 4748 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297399 4748 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297402 4748 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297406 4748 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297410 4748 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297414 4748 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297418 4748 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297421 4748 
feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297426 4748 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297433 4748 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297437 4748 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297441 4748 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297446 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297450 4748 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297455 4748 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297460 4748 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297464 4748 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297468 4748 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297472 4748 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297476 4748 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297480 4748 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297485 4748 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297489 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297493 4748 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297496 4748 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297500 4748 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297505 4748 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297509 4748 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297513 4748 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297516 4748 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297520 4748 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297524 4748 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297527 4748 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297531 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297535 4748 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297538 4748 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297542 4748 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297546 4748 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297550 4748 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297553 4748 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:36:05 crc 
kubenswrapper[4748]: W0320 10:36:05.297556 4748 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297560 4748 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297563 4748 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297567 4748 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.297574 4748 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297680 4748 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297686 4748 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297690 4748 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297694 4748 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297697 4748 feature_gate.go:330] unrecognized feature gate: Example Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297701 4748 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297704 4748 feature_gate.go:330] unrecognized 
feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297709 4748 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297712 4748 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297716 4748 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297721 4748 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297725 4748 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297730 4748 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297734 4748 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297738 4748 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297742 4748 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297745 4748 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297749 4748 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297753 4748 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297757 4748 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297780 
4748 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297784 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297788 4748 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297792 4748 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297796 4748 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297800 4748 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297804 4748 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297808 4748 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297811 4748 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297815 4748 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297819 4748 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297822 4748 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297826 4748 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297841 4748 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297845 4748 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 
10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297849 4748 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297852 4748 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297856 4748 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297860 4748 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297863 4748 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297867 4748 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297870 4748 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297876 4748 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297881 4748 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297885 4748 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297889 4748 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297894 4748 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297897 4748 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297901 4748 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297905 4748 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297909 4748 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297913 4748 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297917 4748 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297921 4748 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297924 4748 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297928 4748 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297932 4748 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297936 4748 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297939 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297943 4748 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297947 4748 feature_gate.go:330] unrecognized feature 
gate: AzureWorkloadIdentity Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297951 4748 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297954 4748 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297958 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297962 4748 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297967 4748 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297972 4748 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297976 4748 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297980 4748 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297984 4748 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.297989 4748 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.297995 4748 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 
10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.298801 4748 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.303769 4748 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.307777 4748 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.307924 4748 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.310165 4748 server.go:997] "Starting client certificate rotation" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.310194 4748 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.310652 4748 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.339465 4748 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.340870 4748 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.341436 4748 dynamic_cafile_content.go:161] "Starting controller" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.358588 4748 log.go:25] "Validated CRI v1 runtime API" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.394682 4748 log.go:25] "Validated CRI v1 image API" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.397084 4748 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.402717 4748 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-10-30-33-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.402821 4748 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.420585 4748 manager.go:217] Machine: {Timestamp:2026-03-20 10:36:05.418007681 +0000 UTC m=+0.559553515 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:1909d2db-5267-4c43-8cb4-dc64b5fa3add BootID:1d9697da-f407-4535-b044-2e042853bd80 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 
HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:10:05:3f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:10:05:3f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:96:f2:c0 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:10:1b:fa Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9d:61:3b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:4c:14:b5 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4a:72:f4:e3:74:6f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9a:5c:28:53:a6:d5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 
Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 
Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.421000 4748 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.421360 4748 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.422781 4748 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.423180 4748 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.423241 4748 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.423611 4748 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.423636 4748 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.424223 4748 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.424290 4748 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.425203 4748 state_mem.go:36] "Initialized new in-memory state store" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.425351 4748 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.428965 4748 kubelet.go:418] "Attempting to sync node with API server" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.429002 4748 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.429046 4748 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.429070 4748 kubelet.go:324] "Adding apiserver pod source" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.429088 4748 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.439226 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.439360 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.439230 4748 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.439432 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.440485 4748 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.441709 4748 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.444264 4748 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445714 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445754 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445768 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445783 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445806 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445821 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445870 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445893 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445910 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445924 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445944 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445958 4748 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.445991 4748 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.446645 4748 server.go:1280] "Started kubelet" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.446941 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.447584 4748 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.447618 4748 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.448546 4748 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 10:36:05 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.450012 4748 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.450090 4748 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.450480 4748 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.450498 4748 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.450713 4748 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.451216 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.451383 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.451453 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.452950 4748 server.go:460] "Adding debug handlers to kubelet server" Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.455315 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="200ms" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.456086 4748 factory.go:55] Registering systemd factory Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.456119 4748 factory.go:221] Registration of the systemd container factory successfully Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.455517 4748 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e864c49ef1a10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.446597136 +0000 UTC m=+0.588142980,LastTimestamp:2026-03-20 10:36:05.446597136 +0000 UTC 
m=+0.588142980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.457091 4748 factory.go:153] Registering CRI-O factory Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.457142 4748 factory.go:221] Registration of the crio container factory successfully Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.457479 4748 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.457558 4748 factory.go:103] Registering Raw factory Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.457585 4748 manager.go:1196] Started watching for new ooms in manager Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.460276 4748 manager.go:319] Starting recovery of all containers Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.464009 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.464241 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.464366 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" 
seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.464469 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.464605 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.464709 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.464789 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.464933 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.465025 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 10:36:05 crc 
kubenswrapper[4748]: I0320 10:36:05.465101 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.465163 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.465233 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.465303 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.465394 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.465480 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.465552 4748 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.465618 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.465693 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.465771 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.466902 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.466972 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.466991 4748 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467009 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467029 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467045 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467061 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467717 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467786 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467819 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467879 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467911 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467939 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467965 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.467993 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" 
seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468061 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468088 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468116 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468141 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468167 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468191 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468218 4748 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468245 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468269 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468295 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468325 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468349 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468374 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468400 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468428 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468454 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468482 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468504 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468539 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468571 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468600 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468628 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468656 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468682 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468708 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" 
seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468738 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468765 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468791 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468817 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468876 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468904 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 
10:36:05.468930 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468958 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.468986 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469014 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469039 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469065 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469094 4748 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469121 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469146 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469171 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469197 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469223 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469247 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469272 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469297 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469322 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469351 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469376 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469403 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469428 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469455 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469480 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469504 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469530 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.469557 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471533 4748 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471603 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471635 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471664 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471691 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471731 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471756 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471781 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471804 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471829 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471883 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471908 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471935 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.471981 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472009 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472046 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472074 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472101 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472130 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472157 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472186 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472212 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472242 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472271 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472300 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472329 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472350 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472369 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472387 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472406 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" 
seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472427 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472445 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472463 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472483 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472505 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472524 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472542 
4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472560 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472580 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472600 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472618 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472639 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472659 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472678 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472698 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472718 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472736 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472756 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472781 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472807 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472869 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472899 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472926 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472951 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.472975 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473002 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473027 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473052 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473077 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473101 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473124 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" 
seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473147 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473170 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473194 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473219 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473246 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473272 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 
10:36:05.473298 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473322 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473348 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473372 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473395 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473422 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473441 4748 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473460 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473477 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473496 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473524 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473551 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473575 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473601 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473629 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473654 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473680 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473703 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473723 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" 
seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473742 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473762 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473780 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473798 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473815 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473893 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473910 4748 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473930 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473956 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473973 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.473991 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474010 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474027 4748 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474045 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474065 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474082 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474101 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474120 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474137 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474155 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474174 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474194 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474212 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474231 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474250 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" 
seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474269 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474288 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474318 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474336 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474354 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474372 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 
10:36:05.474394 4748 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474412 4748 reconstruct.go:97] "Volume reconstruction finished" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.474425 4748 reconciler.go:26] "Reconciler: start to sync state" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.491249 4748 manager.go:324] Recovery completed Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.502727 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.505002 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.505046 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.505057 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.506175 4748 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.506196 4748 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.506222 4748 state_mem.go:36] "Initialized new in-memory state store" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.512019 4748 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.513862 4748 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.513906 4748 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.513978 4748 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.514610 4748 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 10:36:05 crc kubenswrapper[4748]: W0320 10:36:05.515279 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.515327 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.529890 4748 policy_none.go:49] "None policy: Start" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.530945 4748 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.530988 4748 state_mem.go:35] "Initializing new in-memory state store" Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.552157 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.576978 4748 manager.go:334] "Starting Device Plugin manager" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.577070 4748 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.577088 4748 server.go:79] "Starting device plugin registration server" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.577532 4748 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.577559 4748 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.578115 4748 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.578269 4748 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.578283 4748 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.584907 4748 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.615390 4748 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.615543 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.616861 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.616903 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.616913 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.617071 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.617340 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.617394 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.618188 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.618202 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.618219 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.618227 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.618231 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.618244 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.618380 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.618603 4748 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.618640 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619043 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619063 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619073 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619142 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619318 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619351 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619634 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619655 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619659 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619665 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619674 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619689 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619764 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.619915 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.620300 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.620463 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.620521 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.620547 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.623089 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.623114 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.623124 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.623199 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.623238 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.623249 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.623503 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.623592 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.625505 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.625543 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.625554 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.656413 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="400ms" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.675685 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.675759 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.675806 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.675902 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.675944 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.676005 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.676057 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.676111 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.676158 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.676190 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.676240 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.676275 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.676306 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.676352 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.676388 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.677749 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.679637 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.679691 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.679711 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.679747 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.680255 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: 
connection refused" node="crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.777905 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.777952 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.777978 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.777996 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778014 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778034 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778052 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778074 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778090 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778108 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778126 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778141 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778157 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778175 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778192 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778634 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778682 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778653 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778761 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778779 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778788 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778820 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778859 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778824 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778879 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778903 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778937 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.778963 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.779010 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.779034 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.880392 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.881663 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.881787 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.881869 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.881949 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:05 crc kubenswrapper[4748]: E0320 10:36:05.882355 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": 
dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.945945 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.950957 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.977581 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:05 crc kubenswrapper[4748]: I0320 10:36:05.994232 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:06 crc kubenswrapper[4748]: W0320 10:36:06.000980 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-9b1275eee82337f4c9b38259399eb8a8cc3c5ae3723c0d5cbf9ac9ec63e9b9c2 WatchSource:0}: Error finding container 9b1275eee82337f4c9b38259399eb8a8cc3c5ae3723c0d5cbf9ac9ec63e9b9c2: Status 404 returned error can't find the container with id 9b1275eee82337f4c9b38259399eb8a8cc3c5ae3723c0d5cbf9ac9ec63e9b9c2 Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.002118 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:36:06 crc kubenswrapper[4748]: W0320 10:36:06.003928 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-dcfc7269468519b5892dce8b9fa96630d9436529b4faaa5e9ecb1a71afcfce8a WatchSource:0}: Error finding container dcfc7269468519b5892dce8b9fa96630d9436529b4faaa5e9ecb1a71afcfce8a: Status 404 returned error can't find the container with id dcfc7269468519b5892dce8b9fa96630d9436529b4faaa5e9ecb1a71afcfce8a Mar 20 10:36:06 crc kubenswrapper[4748]: W0320 10:36:06.018636 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-905652f24b210715306ba666acbcf6a57a6d694f2838deb5006f014f0cca34ba WatchSource:0}: Error finding container 905652f24b210715306ba666acbcf6a57a6d694f2838deb5006f014f0cca34ba: Status 404 returned error can't find the container with id 905652f24b210715306ba666acbcf6a57a6d694f2838deb5006f014f0cca34ba Mar 20 10:36:06 crc kubenswrapper[4748]: W0320 10:36:06.024198 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-98549c768db31e8d4fb8cc2f95c8383cf96fe2cdb937e4b4615441486850d986 WatchSource:0}: Error finding container 98549c768db31e8d4fb8cc2f95c8383cf96fe2cdb937e4b4615441486850d986: Status 404 returned error can't find the container with id 98549c768db31e8d4fb8cc2f95c8383cf96fe2cdb937e4b4615441486850d986 Mar 20 10:36:06 crc kubenswrapper[4748]: W0320 10:36:06.030669 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-72c2bb49306eca273298c162d632038f434c963c72385767419d7c859931f5a1 
WatchSource:0}: Error finding container 72c2bb49306eca273298c162d632038f434c963c72385767419d7c859931f5a1: Status 404 returned error can't find the container with id 72c2bb49306eca273298c162d632038f434c963c72385767419d7c859931f5a1 Mar 20 10:36:06 crc kubenswrapper[4748]: E0320 10:36:06.057451 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="800ms" Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.283321 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.285642 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.285687 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.285696 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.285726 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:06 crc kubenswrapper[4748]: E0320 10:36:06.286286 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Mar 20 10:36:06 crc kubenswrapper[4748]: W0320 10:36:06.437134 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:06 crc 
kubenswrapper[4748]: E0320 10:36:06.437253 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.448073 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.519100 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"72c2bb49306eca273298c162d632038f434c963c72385767419d7c859931f5a1"} Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.520461 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"98549c768db31e8d4fb8cc2f95c8383cf96fe2cdb937e4b4615441486850d986"} Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.522132 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"905652f24b210715306ba666acbcf6a57a6d694f2838deb5006f014f0cca34ba"} Mar 20 10:36:06 crc kubenswrapper[4748]: I0320 10:36:06.523305 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dcfc7269468519b5892dce8b9fa96630d9436529b4faaa5e9ecb1a71afcfce8a"} Mar 20 10:36:06 crc 
kubenswrapper[4748]: I0320 10:36:06.524649 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9b1275eee82337f4c9b38259399eb8a8cc3c5ae3723c0d5cbf9ac9ec63e9b9c2"} Mar 20 10:36:06 crc kubenswrapper[4748]: W0320 10:36:06.625470 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:06 crc kubenswrapper[4748]: E0320 10:36:06.625575 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:06 crc kubenswrapper[4748]: W0320 10:36:06.705311 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:06 crc kubenswrapper[4748]: E0320 10:36:06.705413 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:06 crc kubenswrapper[4748]: E0320 10:36:06.858343 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="1.6s" Mar 20 10:36:06 crc kubenswrapper[4748]: W0320 10:36:06.928870 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:06 crc kubenswrapper[4748]: E0320 10:36:06.928999 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.086691 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.088672 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.088724 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.088756 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.088790 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:07 crc kubenswrapper[4748]: E0320 10:36:07.089372 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" 
node="crc" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.448140 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.476413 4748 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:36:07 crc kubenswrapper[4748]: E0320 10:36:07.477570 4748 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.530966 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0f3d8fdad8305b6f104a43617eca5ca5af9e35960dcf1d0099d533b140d97ba3"} Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.531036 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"14f9538db2803ef3f6e195b735774be885fa4a2ec0958969e3977939282b3847"} Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.531060 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f7f38cc9f0d5c3bfd12dbac6def39d6daab1ee85f93b3932259db6d4e31d2625"} Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.531078 
4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"639e31d3a83e1b703bfdbfc7c3e881ef42ced35bc22d96d8917993131c97f72c"} Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.531151 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.532477 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.532526 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.532545 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.533212 4748 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27" exitCode=0 Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.533397 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27"} Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.533608 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.534854 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.534880 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.534891 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.535064 4748 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="67a0412a59eedfa9ebddeebc4a645fb62a39cda02904a24a57b8baf157539ea7" exitCode=0 Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.535105 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"67a0412a59eedfa9ebddeebc4a645fb62a39cda02904a24a57b8baf157539ea7"} Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.535236 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.536116 4748 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="8613c7e56f98b48d583c07390841b7277e84a41265152cd201d32d1d38e5bedf" exitCode=0 Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.536211 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.536257 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"8613c7e56f98b48d583c07390841b7277e84a41265152cd201d32d1d38e5bedf"} Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.536329 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.536381 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.536402 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.536962 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.536986 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.536996 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.538356 4748 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51" exitCode=0 Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.538396 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51"} Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.538413 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.540116 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.540173 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.540197 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:07 crc 
kubenswrapper[4748]: I0320 10:36:07.543333 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.545024 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.545087 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:07 crc kubenswrapper[4748]: I0320 10:36:07.545107 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:08 crc kubenswrapper[4748]: W0320 10:36:08.335127 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:08 crc kubenswrapper[4748]: E0320 10:36:08.335233 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.447829 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:08 crc kubenswrapper[4748]: E0320 10:36:08.459731 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.204:6443: connect: connection refused" interval="3.2s" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.546764 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf"} Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.546814 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1"} Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.546824 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b"} Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.546861 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11"} Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.548164 4748 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8" exitCode=0 Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.548241 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8"} Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.548258 4748 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.549398 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.549438 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.549453 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.550067 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"53aa23c4ccfb6ffc32263a3be4e2bf043165341a260bc416de7d29a9d308d956"} Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.550090 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.550787 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.550816 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.550829 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.553786 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.554187 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.554501 4748 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"662e7c57da2d014fb8a864ee093bc58631487e042ea763095483fd098faf44c3"} Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.554531 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c33501d9e18a89b644641b44f64e40dbf0a80d9e7c7ef063bb49339a4913b046"} Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.554544 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4bb2c0cf28796b2a689d8015d03b548ba0bbda713d57ccc3abda93452a52e3c3"} Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.554872 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.554897 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.554907 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.555756 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.555781 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.555792 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:08 crc kubenswrapper[4748]: W0320 10:36:08.556208 4748 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:08 crc kubenswrapper[4748]: E0320 10:36:08.556284 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:08 crc kubenswrapper[4748]: W0320 10:36:08.597357 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Mar 20 10:36:08 crc kubenswrapper[4748]: E0320 10:36:08.597442 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.689900 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.691538 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.691592 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.691602 4748 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.691630 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:08 crc kubenswrapper[4748]: E0320 10:36:08.692155 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Mar 20 10:36:08 crc kubenswrapper[4748]: I0320 10:36:08.858189 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.559823 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b79bc18ef02ffbff86c29841cc594606e3eb44cb3d39606fdbcf3c8b557e645b"} Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.560026 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.561201 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.561228 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.561239 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.562587 4748 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737" exitCode=0 Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.562667 4748 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737"} Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.562728 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.562737 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.562731 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.563959 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.564011 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.564036 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.564239 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.564264 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.564272 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.564337 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.564386 4748 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.564410 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:09 crc kubenswrapper[4748]: I0320 10:36:09.776453 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.257253 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.257481 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.258997 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.259058 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.259071 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.570884 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86"} Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.570956 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.570963 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955"} Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.570994 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7"} Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.571018 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141"} Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.570926 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.572045 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.572069 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.572081 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.572643 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.572688 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:10 crc kubenswrapper[4748]: I0320 10:36:10.572712 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 
10:36:11.003290 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.578866 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"78a3b0d0138fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260"} Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.578932 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.579133 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.579947 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.580017 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.580037 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.580785 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.580821 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.580851 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.816756 4748 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:36:11 crc 
kubenswrapper[4748]: I0320 10:36:11.892542 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.894042 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.894085 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.894098 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:11 crc kubenswrapper[4748]: I0320 10:36:11.894129 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:12 crc kubenswrapper[4748]: I0320 10:36:12.482714 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:12 crc kubenswrapper[4748]: I0320 10:36:12.581283 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:12 crc kubenswrapper[4748]: I0320 10:36:12.581297 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:12 crc kubenswrapper[4748]: I0320 10:36:12.582343 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:12 crc kubenswrapper[4748]: I0320 10:36:12.582390 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:12 crc kubenswrapper[4748]: I0320 10:36:12.582400 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:12 crc kubenswrapper[4748]: I0320 10:36:12.583089 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 20 10:36:12 crc kubenswrapper[4748]: I0320 10:36:12.583125 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:12 crc kubenswrapper[4748]: I0320 10:36:12.583136 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.045186 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.583762 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.583935 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.585043 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.585102 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.585122 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.585715 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.585754 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.585772 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.938141 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.938377 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.940507 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.940551 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:13 crc kubenswrapper[4748]: I0320 10:36:13.940561 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:14 crc kubenswrapper[4748]: I0320 10:36:14.137698 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:14 crc kubenswrapper[4748]: I0320 10:36:14.586325 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:14 crc kubenswrapper[4748]: I0320 10:36:14.587235 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:14 crc kubenswrapper[4748]: I0320 10:36:14.587255 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:14 crc kubenswrapper[4748]: I0320 10:36:14.587263 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:15 crc kubenswrapper[4748]: I0320 10:36:15.453866 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:15 crc kubenswrapper[4748]: I0320 10:36:15.462077 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:15 crc kubenswrapper[4748]: E0320 10:36:15.585103 4748 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:36:15 crc kubenswrapper[4748]: I0320 10:36:15.588573 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:15 crc kubenswrapper[4748]: I0320 10:36:15.589792 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:15 crc kubenswrapper[4748]: I0320 10:36:15.589826 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:15 crc kubenswrapper[4748]: I0320 10:36:15.589852 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:16 crc kubenswrapper[4748]: I0320 10:36:16.591304 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:16 crc kubenswrapper[4748]: I0320 10:36:16.592493 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:16 crc kubenswrapper[4748]: I0320 10:36:16.592568 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:16 crc kubenswrapper[4748]: I0320 10:36:16.592592 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:16 crc kubenswrapper[4748]: I0320 10:36:16.601274 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:17 crc kubenswrapper[4748]: I0320 10:36:17.138063 4748 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:36:17 crc kubenswrapper[4748]: I0320 10:36:17.138164 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 10:36:17 crc kubenswrapper[4748]: I0320 10:36:17.594905 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:17 crc kubenswrapper[4748]: I0320 10:36:17.596309 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:17 crc kubenswrapper[4748]: I0320 10:36:17.596355 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:17 crc kubenswrapper[4748]: I0320 10:36:17.596369 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:18 crc kubenswrapper[4748]: I0320 10:36:18.582063 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 10:36:18 crc kubenswrapper[4748]: I0320 10:36:18.582278 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:18 crc kubenswrapper[4748]: I0320 10:36:18.583804 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:18 crc kubenswrapper[4748]: I0320 10:36:18.583890 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:36:18 crc kubenswrapper[4748]: I0320 10:36:18.583915 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.199383 4748 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:32846->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.199477 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:32846->192.168.126.11:17697: read: connection reset by peer" Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.448599 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 10:36:19 crc kubenswrapper[4748]: W0320 10:36:19.592544 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.592688 4748 trace.go:236] Trace[1743229308]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 10:36:09.591) (total time: 10001ms): Mar 20 10:36:19 crc kubenswrapper[4748]: Trace[1743229308]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (10:36:19.592) Mar 20 10:36:19 crc kubenswrapper[4748]: Trace[1743229308]: [10.001306734s] [10.001306734s] END Mar 20 10:36:19 crc kubenswrapper[4748]: E0320 10:36:19.592721 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.606759 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.610154 4748 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b79bc18ef02ffbff86c29841cc594606e3eb44cb3d39606fdbcf3c8b557e645b" exitCode=255 Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.610216 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b79bc18ef02ffbff86c29841cc594606e3eb44cb3d39606fdbcf3c8b557e645b"} Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.610453 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.611543 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.611590 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:19 crc kubenswrapper[4748]: 
I0320 10:36:19.611611 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.612257 4748 scope.go:117] "RemoveContainer" containerID="b79bc18ef02ffbff86c29841cc594606e3eb44cb3d39606fdbcf3c8b557e645b" Mar 20 10:36:19 crc kubenswrapper[4748]: E0320 10:36:19.825069 4748 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.189e864c49ef1a10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.446597136 +0000 UTC m=+0.588142980,LastTimestamp:2026-03-20 10:36:05.446597136 +0000 UTC m=+0.588142980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:19 crc kubenswrapper[4748]: W0320 10:36:19.852012 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:19Z is after 2026-02-23T05:33:13Z Mar 20 10:36:19 crc kubenswrapper[4748]: E0320 10:36:19.852091 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:19Z is 
after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.857063 4748 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.857159 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 10:36:19 crc kubenswrapper[4748]: E0320 10:36:19.857568 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:19Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 10:36:19 crc kubenswrapper[4748]: W0320 10:36:19.858730 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:19Z is after 2026-02-23T05:33:13Z Mar 20 10:36:19 crc kubenswrapper[4748]: E0320 10:36:19.858798 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:19 crc kubenswrapper[4748]: E0320 10:36:19.860042 4748 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:19 crc kubenswrapper[4748]: W0320 10:36:19.860967 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:19Z is after 2026-02-23T05:33:13Z Mar 20 10:36:19 crc kubenswrapper[4748]: E0320 10:36:19.861018 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.862683 4748 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:36:19 crc kubenswrapper[4748]: I0320 10:36:19.862742 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 10:36:19 crc kubenswrapper[4748]: E0320 10:36:19.862871 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:19Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:36:20 crc kubenswrapper[4748]: I0320 10:36:20.452533 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:20Z is after 2026-02-23T05:33:13Z Mar 20 10:36:20 crc kubenswrapper[4748]: I0320 10:36:20.619626 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:36:20 crc kubenswrapper[4748]: I0320 10:36:20.621304 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"46dd42b91265367f6f5655aedcd4d4b37c6328104e7ec008b5826bbe64a6e71f"} Mar 20 10:36:20 crc kubenswrapper[4748]: I0320 10:36:20.621473 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 
20 10:36:20 crc kubenswrapper[4748]: I0320 10:36:20.622444 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:20 crc kubenswrapper[4748]: I0320 10:36:20.622493 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:20 crc kubenswrapper[4748]: I0320 10:36:20.622526 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.007240 4748 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]log ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]etcd ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/priority-and-fairness-filter ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/start-apiextensions-informers ok Mar 20 10:36:21 crc 
kubenswrapper[4748]: [+]poststarthook/start-apiextensions-controllers ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/crd-informer-synced ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/start-system-namespaces-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 20 10:36:21 crc kubenswrapper[4748]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/bootstrap-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/start-kube-aggregator-informers ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/apiservice-registration-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/apiservice-discovery-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 20 10:36:21 crc kubenswrapper[4748]: 
[+]autoregister-completion ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/apiservice-openapi-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 20 10:36:21 crc kubenswrapper[4748]: livez check failed Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.007307 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.451069 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:21Z is after 2026-02-23T05:33:13Z Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.626617 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.628015 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.631109 4748 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="46dd42b91265367f6f5655aedcd4d4b37c6328104e7ec008b5826bbe64a6e71f" exitCode=255 Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.631195 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"46dd42b91265367f6f5655aedcd4d4b37c6328104e7ec008b5826bbe64a6e71f"} Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.631260 4748 scope.go:117] "RemoveContainer" containerID="b79bc18ef02ffbff86c29841cc594606e3eb44cb3d39606fdbcf3c8b557e645b" Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.631396 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.632408 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.632453 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.632465 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:21 crc kubenswrapper[4748]: I0320 10:36:21.633140 4748 scope.go:117] "RemoveContainer" containerID="46dd42b91265367f6f5655aedcd4d4b37c6328104e7ec008b5826bbe64a6e71f" Mar 20 10:36:21 crc kubenswrapper[4748]: E0320 10:36:21.633396 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:36:22 crc kubenswrapper[4748]: I0320 10:36:22.453193 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T10:36:22Z is after 2026-02-23T05:33:13Z Mar 20 10:36:22 crc kubenswrapper[4748]: I0320 10:36:22.636072 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:36:23 crc kubenswrapper[4748]: I0320 10:36:23.451376 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:23Z is after 2026-02-23T05:33:13Z Mar 20 10:36:24 crc kubenswrapper[4748]: I0320 10:36:24.452384 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:24Z is after 2026-02-23T05:33:13Z Mar 20 10:36:25 crc kubenswrapper[4748]: I0320 10:36:25.450801 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:25Z is after 2026-02-23T05:33:13Z Mar 20 10:36:25 crc kubenswrapper[4748]: E0320 10:36:25.585247 4748 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:36:25 crc kubenswrapper[4748]: W0320 10:36:25.665914 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T10:36:25Z is after 2026-02-23T05:33:13Z Mar 20 10:36:25 crc kubenswrapper[4748]: E0320 10:36:25.666029 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.010781 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.010973 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.012420 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.012474 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.012528 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.013447 4748 scope.go:117] "RemoveContainer" containerID="46dd42b91265367f6f5655aedcd4d4b37c6328104e7ec008b5826bbe64a6e71f" Mar 20 10:36:26 crc kubenswrapper[4748]: E0320 10:36:26.013718 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.015545 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:26 crc kubenswrapper[4748]: E0320 10:36:26.260635 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:26Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.263854 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.268690 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.268744 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.268758 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.268788 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:26 crc kubenswrapper[4748]: E0320 10:36:26.274182 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:26Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:36:26 crc 
kubenswrapper[4748]: W0320 10:36:26.319699 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:26Z is after 2026-02-23T05:33:13Z Mar 20 10:36:26 crc kubenswrapper[4748]: E0320 10:36:26.319923 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.451291 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:26Z is after 2026-02-23T05:33:13Z Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.649050 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.650074 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.650110 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:26 crc kubenswrapper[4748]: I0320 10:36:26.650121 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:26 crc 
kubenswrapper[4748]: I0320 10:36:26.650625 4748 scope.go:117] "RemoveContainer" containerID="46dd42b91265367f6f5655aedcd4d4b37c6328104e7ec008b5826bbe64a6e71f" Mar 20 10:36:26 crc kubenswrapper[4748]: E0320 10:36:26.650789 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:36:27 crc kubenswrapper[4748]: I0320 10:36:27.138437 4748 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:36:27 crc kubenswrapper[4748]: I0320 10:36:27.138534 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:36:27 crc kubenswrapper[4748]: I0320 10:36:27.453234 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:27Z is after 2026-02-23T05:33:13Z Mar 20 10:36:27 crc kubenswrapper[4748]: I0320 10:36:27.863574 4748 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:36:27 crc kubenswrapper[4748]: E0320 10:36:27.869864 4748 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.406864 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.407167 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.409090 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.409171 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.409185 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.409849 4748 scope.go:117] "RemoveContainer" containerID="46dd42b91265367f6f5655aedcd4d4b37c6328104e7ec008b5826bbe64a6e71f" Mar 20 10:36:28 crc kubenswrapper[4748]: E0320 10:36:28.410041 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.452964 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:28Z is after 2026-02-23T05:33:13Z Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.612318 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.612622 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.614239 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.614285 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.614327 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.631363 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.654935 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.656220 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.656266 4748 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:28 crc kubenswrapper[4748]: I0320 10:36:28.656279 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:29 crc kubenswrapper[4748]: I0320 10:36:29.451352 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:29Z is after 2026-02-23T05:33:13Z Mar 20 10:36:29 crc kubenswrapper[4748]: W0320 10:36:29.479653 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:29Z is after 2026-02-23T05:33:13Z Mar 20 10:36:29 crc kubenswrapper[4748]: E0320 10:36:29.479764 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:29 crc kubenswrapper[4748]: I0320 10:36:29.776888 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:29 crc kubenswrapper[4748]: I0320 10:36:29.777180 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:29 crc kubenswrapper[4748]: 
I0320 10:36:29.778880 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:29 crc kubenswrapper[4748]: I0320 10:36:29.778934 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:29 crc kubenswrapper[4748]: I0320 10:36:29.778946 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:29 crc kubenswrapper[4748]: I0320 10:36:29.779625 4748 scope.go:117] "RemoveContainer" containerID="46dd42b91265367f6f5655aedcd4d4b37c6328104e7ec008b5826bbe64a6e71f" Mar 20 10:36:29 crc kubenswrapper[4748]: E0320 10:36:29.779865 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:36:29 crc kubenswrapper[4748]: E0320 10:36:29.831059 4748 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e864c49ef1a10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.446597136 +0000 UTC m=+0.588142980,LastTimestamp:2026-03-20 10:36:05.446597136 +0000 UTC m=+0.588142980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:30 crc kubenswrapper[4748]: I0320 10:36:30.450966 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:30Z is after 2026-02-23T05:33:13Z Mar 20 10:36:30 crc kubenswrapper[4748]: W0320 10:36:30.786054 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:30Z is after 2026-02-23T05:33:13Z Mar 20 10:36:30 crc kubenswrapper[4748]: E0320 10:36:30.786165 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:31 crc kubenswrapper[4748]: I0320 10:36:31.454944 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:31Z is after 2026-02-23T05:33:13Z Mar 20 10:36:32 crc kubenswrapper[4748]: I0320 10:36:32.452241 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:32Z is after 2026-02-23T05:33:13Z Mar 20 10:36:33 crc kubenswrapper[4748]: E0320 10:36:33.266189 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:33Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:36:33 crc kubenswrapper[4748]: I0320 10:36:33.274442 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:33 crc kubenswrapper[4748]: I0320 10:36:33.276287 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:33 crc kubenswrapper[4748]: I0320 10:36:33.276355 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:33 crc kubenswrapper[4748]: I0320 10:36:33.276380 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:33 crc kubenswrapper[4748]: I0320 10:36:33.276422 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:33 crc kubenswrapper[4748]: E0320 10:36:33.279860 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:33Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:36:33 crc kubenswrapper[4748]: I0320 10:36:33.452867 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:33Z is after 2026-02-23T05:33:13Z Mar 20 10:36:34 crc kubenswrapper[4748]: I0320 10:36:34.455925 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:34Z is after 2026-02-23T05:33:13Z Mar 20 10:36:35 crc kubenswrapper[4748]: I0320 10:36:35.450633 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:35Z is after 2026-02-23T05:33:13Z Mar 20 10:36:35 crc kubenswrapper[4748]: E0320 10:36:35.585475 4748 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:36:36 crc kubenswrapper[4748]: I0320 10:36:36.450415 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:36Z is after 2026-02-23T05:33:13Z Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.138905 4748 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.139019 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.139102 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.139296 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.140983 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.141059 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.141078 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.141929 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f7f38cc9f0d5c3bfd12dbac6def39d6daab1ee85f93b3932259db6d4e31d2625"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.142195 4748 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://f7f38cc9f0d5c3bfd12dbac6def39d6daab1ee85f93b3932259db6d4e31d2625" gracePeriod=30 Mar 20 10:36:37 crc kubenswrapper[4748]: W0320 10:36:37.416073 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:37Z is after 2026-02-23T05:33:13Z Mar 20 10:36:37 crc kubenswrapper[4748]: E0320 10:36:37.416187 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.451294 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:37Z is after 2026-02-23T05:33:13Z Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.683804 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.684283 4748 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="f7f38cc9f0d5c3bfd12dbac6def39d6daab1ee85f93b3932259db6d4e31d2625" exitCode=255 Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.684335 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f7f38cc9f0d5c3bfd12dbac6def39d6daab1ee85f93b3932259db6d4e31d2625"} Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.684370 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5a7fa8f56b588979491c8a1437d968d2d418ce0e8d18c052571a0793f41e2826"} Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.684475 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.685404 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.685441 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:37 crc kubenswrapper[4748]: I0320 10:36:37.685453 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:38 crc kubenswrapper[4748]: I0320 10:36:38.450417 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:38Z is after 2026-02-23T05:33:13Z Mar 20 10:36:39 crc kubenswrapper[4748]: I0320 10:36:39.451744 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:39Z is after 2026-02-23T05:33:13Z Mar 20 10:36:39 crc kubenswrapper[4748]: E0320 10:36:39.835253 4748 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e864c49ef1a10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.446597136 +0000 UTC m=+0.588142980,LastTimestamp:2026-03-20 10:36:05.446597136 +0000 UTC m=+0.588142980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:40 crc kubenswrapper[4748]: E0320 10:36:40.272646 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:40Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:36:40 crc kubenswrapper[4748]: I0320 10:36:40.280782 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:40 crc kubenswrapper[4748]: I0320 10:36:40.282810 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:40 crc kubenswrapper[4748]: 
I0320 10:36:40.282944 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:40 crc kubenswrapper[4748]: I0320 10:36:40.282972 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:40 crc kubenswrapper[4748]: I0320 10:36:40.283025 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:40 crc kubenswrapper[4748]: E0320 10:36:40.289330 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:40Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:36:40 crc kubenswrapper[4748]: I0320 10:36:40.453135 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:40Z is after 2026-02-23T05:33:13Z Mar 20 10:36:41 crc kubenswrapper[4748]: I0320 10:36:41.450971 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:41Z is after 2026-02-23T05:33:13Z Mar 20 10:36:42 crc kubenswrapper[4748]: I0320 10:36:42.453208 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:42Z is after 2026-02-23T05:33:13Z Mar 20 
10:36:43 crc kubenswrapper[4748]: I0320 10:36:43.452509 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:43Z is after 2026-02-23T05:33:13Z Mar 20 10:36:43 crc kubenswrapper[4748]: I0320 10:36:43.905547 4748 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:36:43 crc kubenswrapper[4748]: E0320 10:36:43.908989 4748 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:43 crc kubenswrapper[4748]: E0320 10:36:43.910232 4748 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 20 10:36:43 crc kubenswrapper[4748]: I0320 10:36:43.938793 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:43 crc kubenswrapper[4748]: I0320 10:36:43.939049 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:43 crc kubenswrapper[4748]: I0320 10:36:43.940359 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:43 crc kubenswrapper[4748]: I0320 10:36:43.940401 4748 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:43 crc kubenswrapper[4748]: I0320 10:36:43.940414 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.138595 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.453315 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:44Z is after 2026-02-23T05:33:13Z Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.514486 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.516083 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.516146 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.516172 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.517017 4748 scope.go:117] "RemoveContainer" containerID="46dd42b91265367f6f5655aedcd4d4b37c6328104e7ec008b5826bbe64a6e71f" Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.703910 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.704940 4748 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.705013 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:44 crc kubenswrapper[4748]: I0320 10:36:44.705042 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.451212 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:45Z is after 2026-02-23T05:33:13Z Mar 20 10:36:45 crc kubenswrapper[4748]: E0320 10:36:45.585919 4748 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.708083 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.708684 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.710637 4748 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c78f06d28a973f07c57c8737ae920f13230f8b7712f86c5d779b4e6abf22c67e" exitCode=255 Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.710679 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c78f06d28a973f07c57c8737ae920f13230f8b7712f86c5d779b4e6abf22c67e"} Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.710725 4748 scope.go:117] "RemoveContainer" containerID="46dd42b91265367f6f5655aedcd4d4b37c6328104e7ec008b5826bbe64a6e71f" Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.710963 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.712217 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.712270 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.712282 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:45 crc kubenswrapper[4748]: I0320 10:36:45.712996 4748 scope.go:117] "RemoveContainer" containerID="c78f06d28a973f07c57c8737ae920f13230f8b7712f86c5d779b4e6abf22c67e" Mar 20 10:36:45 crc kubenswrapper[4748]: E0320 10:36:45.713202 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:36:46 crc kubenswrapper[4748]: W0320 10:36:46.129410 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T10:36:46Z is after 2026-02-23T05:33:13Z Mar 20 10:36:46 crc kubenswrapper[4748]: E0320 10:36:46.129812 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:36:46 crc kubenswrapper[4748]: I0320 10:36:46.451619 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:46Z is after 2026-02-23T05:33:13Z Mar 20 10:36:46 crc kubenswrapper[4748]: I0320 10:36:46.716042 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:36:47 crc kubenswrapper[4748]: I0320 10:36:47.138684 4748 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:36:47 crc kubenswrapper[4748]: I0320 10:36:47.138780 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:36:47 crc kubenswrapper[4748]: E0320 10:36:47.278006 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:47Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:36:47 crc kubenswrapper[4748]: I0320 10:36:47.290300 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:47 crc kubenswrapper[4748]: I0320 10:36:47.292080 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:47 crc kubenswrapper[4748]: I0320 10:36:47.292120 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:47 crc kubenswrapper[4748]: I0320 10:36:47.292132 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:47 crc kubenswrapper[4748]: I0320 10:36:47.292161 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:47 crc kubenswrapper[4748]: E0320 10:36:47.295517 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:47Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:36:47 crc kubenswrapper[4748]: I0320 10:36:47.451012 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T10:36:47Z is after 2026-02-23T05:33:13Z Mar 20 10:36:48 crc kubenswrapper[4748]: I0320 10:36:48.407508 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:48 crc kubenswrapper[4748]: I0320 10:36:48.407715 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:48 crc kubenswrapper[4748]: I0320 10:36:48.409209 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:48 crc kubenswrapper[4748]: I0320 10:36:48.409248 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:48 crc kubenswrapper[4748]: I0320 10:36:48.409261 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:48 crc kubenswrapper[4748]: I0320 10:36:48.409794 4748 scope.go:117] "RemoveContainer" containerID="c78f06d28a973f07c57c8737ae920f13230f8b7712f86c5d779b4e6abf22c67e" Mar 20 10:36:48 crc kubenswrapper[4748]: E0320 10:36:48.410003 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:36:48 crc kubenswrapper[4748]: I0320 10:36:48.452743 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:36:48Z is 
after 2026-02-23T05:33:13Z Mar 20 10:36:49 crc kubenswrapper[4748]: I0320 10:36:49.453414 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:36:49 crc kubenswrapper[4748]: W0320 10:36:49.734705 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.734819 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 10:36:49 crc kubenswrapper[4748]: I0320 10:36:49.776998 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:36:49 crc kubenswrapper[4748]: I0320 10:36:49.777315 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:49 crc kubenswrapper[4748]: I0320 10:36:49.779054 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:49 crc kubenswrapper[4748]: I0320 10:36:49.779111 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:49 crc kubenswrapper[4748]: I0320 10:36:49.779130 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:49 crc kubenswrapper[4748]: I0320 10:36:49.779973 4748 scope.go:117] "RemoveContainer" 
containerID="c78f06d28a973f07c57c8737ae920f13230f8b7712f86c5d779b4e6abf22c67e" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.780248 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.843631 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c49ef1a10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.446597136 +0000 UTC m=+0.588142980,LastTimestamp:2026-03-20 10:36:05.446597136 +0000 UTC m=+0.588142980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.850597 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6ad940 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505038656 +0000 UTC m=+0.646584470,LastTimestamp:2026-03-20 10:36:05.505038656 +0000 UTC m=+0.646584470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.855824 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b120c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505053196 +0000 UTC m=+0.646599010,LastTimestamp:2026-03-20 10:36:05.505053196 +0000 UTC m=+0.646599010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.861389 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b3aca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505063626 +0000 UTC m=+0.646609440,LastTimestamp:2026-03-20 10:36:05.505063626 +0000 UTC 
m=+0.646609440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.866108 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c51dc4a37 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.579582007 +0000 UTC m=+0.721127821,LastTimestamp:2026-03-20 10:36:05.579582007 +0000 UTC m=+0.721127821,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.874033 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6ad940\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6ad940 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505038656 +0000 UTC m=+0.646584470,LastTimestamp:2026-03-20 10:36:05.616890683 +0000 UTC m=+0.758436497,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc 
kubenswrapper[4748]: E0320 10:36:49.878193 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b120c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b120c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505053196 +0000 UTC m=+0.646599010,LastTimestamp:2026-03-20 10:36:05.616910074 +0000 UTC m=+0.758455888,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.884994 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b3aca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b3aca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505063626 +0000 UTC m=+0.646609440,LastTimestamp:2026-03-20 10:36:05.616918364 +0000 UTC m=+0.758464178,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.891369 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6ad940\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6ad940 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505038656 +0000 UTC m=+0.646584470,LastTimestamp:2026-03-20 10:36:05.618204745 +0000 UTC m=+0.759750559,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.897301 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6ad940\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6ad940 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505038656 +0000 UTC m=+0.646584470,LastTimestamp:2026-03-20 10:36:05.618219446 +0000 UTC m=+0.759765270,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.903732 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b120c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b120c default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505053196 +0000 UTC m=+0.646599010,LastTimestamp:2026-03-20 10:36:05.618226116 +0000 UTC m=+0.759771930,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.909618 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b120c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b120c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505053196 +0000 UTC m=+0.646599010,LastTimestamp:2026-03-20 10:36:05.618236856 +0000 UTC m=+0.759782690,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.914667 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b3aca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b3aca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505063626 +0000 UTC m=+0.646609440,LastTimestamp:2026-03-20 10:36:05.618247166 +0000 UTC m=+0.759792970,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.919544 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b3aca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b3aca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505063626 +0000 UTC m=+0.646609440,LastTimestamp:2026-03-20 10:36:05.618252026 +0000 UTC m=+0.759797850,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.924279 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6ad940\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6ad940 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505038656 +0000 UTC m=+0.646584470,LastTimestamp:2026-03-20 10:36:05.619053846 +0000 UTC m=+0.760599660,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.929471 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b120c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b120c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505053196 +0000 UTC m=+0.646599010,LastTimestamp:2026-03-20 10:36:05.619068246 +0000 UTC m=+0.760614060,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.934824 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b3aca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b3aca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505063626 +0000 UTC 
m=+0.646609440,LastTimestamp:2026-03-20 10:36:05.619077906 +0000 UTC m=+0.760623720,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.939765 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6ad940\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6ad940 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505038656 +0000 UTC m=+0.646584470,LastTimestamp:2026-03-20 10:36:05.6196487 +0000 UTC m=+0.761194514,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.944671 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b120c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b120c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505053196 +0000 UTC m=+0.646599010,LastTimestamp:2026-03-20 10:36:05.619661341 +0000 UTC m=+0.761207155,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.949291 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6ad940\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6ad940 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505038656 +0000 UTC m=+0.646584470,LastTimestamp:2026-03-20 10:36:05.619669401 +0000 UTC m=+0.761215215,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.956463 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b3aca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b3aca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505063626 +0000 UTC m=+0.646609440,LastTimestamp:2026-03-20 10:36:05.619678011 +0000 UTC m=+0.761223825,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.962202 4748 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b120c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b120c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505053196 +0000 UTC m=+0.646599010,LastTimestamp:2026-03-20 10:36:05.619686321 +0000 UTC m=+0.761232135,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.966189 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b3aca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b3aca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505063626 +0000 UTC m=+0.646609440,LastTimestamp:2026-03-20 10:36:05.619694691 +0000 UTC m=+0.761240505,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.970104 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6ad940\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6ad940 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505038656 +0000 UTC m=+0.646584470,LastTimestamp:2026-03-20 10:36:05.620506271 +0000 UTC m=+0.762052115,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.974052 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e864c4d6b120c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e864c4d6b120c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:05.505053196 +0000 UTC m=+0.646599010,LastTimestamp:2026-03-20 10:36:05.620536142 +0000 UTC m=+0.762081996,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.979239 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864c6b6e15a0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.0085672 +0000 UTC m=+1.150113014,LastTimestamp:2026-03-20 10:36:06.0085672 +0000 UTC m=+1.150113014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.982721 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e864c6b6e27d4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.00857186 +0000 UTC m=+1.150117714,LastTimestamp:2026-03-20 10:36:06.00857186 +0000 UTC m=+1.150117714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.986801 4748 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864c6ca920e2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.029213922 +0000 UTC m=+1.170759756,LastTimestamp:2026-03-20 10:36:06.029213922 +0000 UTC m=+1.170759756,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.990064 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864c6cabef60 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.029397856 +0000 UTC 
m=+1.170943710,LastTimestamp:2026-03-20 10:36:06.029397856 +0000 UTC m=+1.170943710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:49 crc kubenswrapper[4748]: E0320 10:36:49.994993 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864c6ce8a687 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.033376903 +0000 UTC m=+1.174922717,LastTimestamp:2026-03-20 10:36:06.033376903 +0000 UTC m=+1.174922717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.000709 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864c8f6ded09 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.612536585 +0000 UTC m=+1.754082409,LastTimestamp:2026-03-20 10:36:06.612536585 +0000 UTC m=+1.754082409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.007962 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864c8f8d9dbb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.614613435 +0000 UTC m=+1.756159249,LastTimestamp:2026-03-20 10:36:06.614613435 +0000 UTC m=+1.756159249,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.012186 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e864c8fb56309 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.617219849 +0000 UTC m=+1.758765663,LastTimestamp:2026-03-20 10:36:06.617219849 +0000 UTC m=+1.758765663,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.018238 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864c8fe5b22c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.620385836 +0000 UTC m=+1.761931650,LastTimestamp:2026-03-20 10:36:06.620385836 +0000 UTC m=+1.761931650,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.024497 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864c9010a188 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.623199624 +0000 UTC m=+1.764745438,LastTimestamp:2026-03-20 10:36:06.623199624 +0000 UTC m=+1.764745438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.030902 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864c902b4b16 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.624946966 +0000 UTC m=+1.766492780,LastTimestamp:2026-03-20 10:36:06.624946966 +0000 UTC m=+1.766492780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.035583 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864c90463109 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.626709769 +0000 UTC m=+1.768255583,LastTimestamp:2026-03-20 10:36:06.626709769 +0000 UTC m=+1.768255583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.041676 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e864c9058d8b3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.627932339 +0000 UTC m=+1.769478153,LastTimestamp:2026-03-20 10:36:06.627932339 +0000 UTC m=+1.769478153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.046026 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864c907d3d19 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.630317337 +0000 UTC m=+1.771863151,LastTimestamp:2026-03-20 10:36:06.630317337 +0000 UTC m=+1.771863151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.053929 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864c9090d384 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.631601028 +0000 UTC m=+1.773146842,LastTimestamp:2026-03-20 10:36:06.631601028 +0000 UTC m=+1.773146842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.058605 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e864c91a018ea openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.64937905 +0000 UTC m=+1.790924864,LastTimestamp:2026-03-20 10:36:06.64937905 +0000 UTC m=+1.790924864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.062793 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864ca0d7d0c1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.904688833 +0000 UTC 
m=+2.046234647,LastTimestamp:2026-03-20 10:36:06.904688833 +0000 UTC m=+2.046234647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.066876 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864ca19a120d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.917419533 +0000 UTC m=+2.058965347,LastTimestamp:2026-03-20 10:36:06.917419533 +0000 UTC m=+2.058965347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.070546 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864ca1b4a7b9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.919161785 +0000 UTC m=+2.060707599,LastTimestamp:2026-03-20 10:36:06.919161785 +0000 UTC m=+2.060707599,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.074678 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864cae48a89c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.130187932 +0000 UTC m=+2.271733776,LastTimestamp:2026-03-20 10:36:07.130187932 +0000 UTC m=+2.271733776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.078169 4748 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864caef10c7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.14122355 +0000 UTC m=+2.282769374,LastTimestamp:2026-03-20 10:36:07.14122355 +0000 UTC m=+2.282769374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.081680 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864caf082a91 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
10:36:07.142738577 +0000 UTC m=+2.284284431,LastTimestamp:2026-03-20 10:36:07.142738577 +0000 UTC m=+2.284284431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.085618 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864cbb323dd9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.346822617 +0000 UTC m=+2.488368451,LastTimestamp:2026-03-20 10:36:07.346822617 +0000 UTC m=+2.488368451,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.089812 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864cbbf47aca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.359552202 +0000 UTC m=+2.501098026,LastTimestamp:2026-03-20 10:36:07.359552202 +0000 UTC m=+2.501098026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.094522 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864cc68856e0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.537014496 +0000 UTC m=+2.678560340,LastTimestamp:2026-03-20 10:36:07.537014496 +0000 UTC m=+2.678560340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.098566 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e864cc69b6138 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.538262328 +0000 UTC m=+2.679808142,LastTimestamp:2026-03-20 10:36:07.538262328 +0000 UTC m=+2.679808142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.102425 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864cc6aa8082 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.539253378 +0000 UTC m=+2.680799232,LastTimestamp:2026-03-20 10:36:07.539253378 +0000 UTC 
m=+2.680799232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.106753 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cc6e638fa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.543167226 +0000 UTC m=+2.684713040,LastTimestamp:2026-03-20 10:36:07.543167226 +0000 UTC m=+2.684713040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.112002 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cd5946945 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.789463877 +0000 UTC m=+2.931009691,LastTimestamp:2026-03-20 10:36:07.789463877 +0000 UTC m=+2.931009691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.117942 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e864cd5a81d76 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.79075519 +0000 UTC m=+2.932301004,LastTimestamp:2026-03-20 10:36:07.79075519 +0000 UTC m=+2.932301004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.122487 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864cd5a8823e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.79078099 +0000 UTC m=+2.932326844,LastTimestamp:2026-03-20 10:36:07.79078099 +0000 UTC m=+2.932326844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.128984 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864cd5b4bbc4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.791582148 +0000 UTC m=+2.933127962,LastTimestamp:2026-03-20 10:36:07.791582148 +0000 UTC m=+2.933127962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.133134 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864cd63add87 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.800372615 +0000 UTC m=+2.941918429,LastTimestamp:2026-03-20 10:36:07.800372615 +0000 UTC m=+2.941918429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.137663 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864cd6488bf1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.801269233 +0000 UTC m=+2.942815047,LastTimestamp:2026-03-20 10:36:07.801269233 +0000 UTC m=+2.942815047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.141168 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cd66db065 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.803703397 +0000 UTC m=+2.945249251,LastTimestamp:2026-03-20 10:36:07.803703397 +0000 UTC m=+2.945249251,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.145414 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cd67dbe7a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.804755578 +0000 UTC m=+2.946301392,LastTimestamp:2026-03-20 10:36:07.804755578 +0000 UTC m=+2.946301392,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.151867 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e864cd7039dca openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.813529034 +0000 UTC m=+2.955074858,LastTimestamp:2026-03-20 10:36:07.813529034 +0000 UTC m=+2.955074858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.156487 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864cd738e0ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.817019598 +0000 UTC m=+2.958565422,LastTimestamp:2026-03-20 10:36:07.817019598 +0000 UTC m=+2.958565422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.162656 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864ce1d9709b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.995314331 +0000 UTC m=+3.136860145,LastTimestamp:2026-03-20 10:36:07.995314331 +0000 UTC m=+3.136860145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.166769 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864ce2022b31 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:07.997983537 +0000 UTC m=+3.139529351,LastTimestamp:2026-03-20 
10:36:07.997983537 +0000 UTC m=+3.139529351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.172592 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864ce288fdf2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.006819314 +0000 UTC m=+3.148365128,LastTimestamp:2026-03-20 10:36:08.006819314 +0000 UTC m=+3.148365128,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.181390 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864ce29a927f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.007971455 +0000 UTC m=+3.149517269,LastTimestamp:2026-03-20 10:36:08.007971455 +0000 UTC m=+3.149517269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.185979 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864ce32521b4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.017052084 +0000 UTC m=+3.158597898,LastTimestamp:2026-03-20 10:36:08.017052084 +0000 UTC m=+3.158597898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.189806 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864ce33f2895 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.018757781 +0000 UTC m=+3.160303635,LastTimestamp:2026-03-20 10:36:08.018757781 +0000 UTC m=+3.160303635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.193584 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cee57d6af openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.204924591 +0000 UTC m=+3.346470405,LastTimestamp:2026-03-20 10:36:08.204924591 +0000 UTC m=+3.346470405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.197088 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864cee73f383 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.206766979 +0000 UTC m=+3.348312793,LastTimestamp:2026-03-20 10:36:08.206766979 +0000 UTC m=+3.348312793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.200150 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cef41d1b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.220258742 +0000 UTC m=+3.361804556,LastTimestamp:2026-03-20 10:36:08.220258742 +0000 UTC m=+3.361804556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 
10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.203966 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cef54ac34 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.221494324 +0000 UTC m=+3.363040148,LastTimestamp:2026-03-20 10:36:08.221494324 +0000 UTC m=+3.363040148,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.208362 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e864cefb822d6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
10:36:08.228012758 +0000 UTC m=+3.369558572,LastTimestamp:2026-03-20 10:36:08.228012758 +0000 UTC m=+3.369558572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.211692 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cfad010e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.414130406 +0000 UTC m=+3.555676220,LastTimestamp:2026-03-20 10:36:08.414130406 +0000 UTC m=+3.555676220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.214983 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cfb6b812a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.424317226 +0000 UTC m=+3.565863040,LastTimestamp:2026-03-20 10:36:08.424317226 +0000 UTC m=+3.565863040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.218609 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cfb7ed6bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.425584319 +0000 UTC m=+3.567130133,LastTimestamp:2026-03-20 10:36:08.425584319 +0000 UTC m=+3.567130133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.223153 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d02fc62b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.551252664 +0000 UTC m=+3.692798478,LastTimestamp:2026-03-20 10:36:08.551252664 +0000 UTC m=+3.692798478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.227175 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864d06840c8a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.610475146 +0000 UTC m=+3.752020960,LastTimestamp:2026-03-20 10:36:08.610475146 +0000 UTC m=+3.752020960,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.230446 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864d07386c30 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.622296112 +0000 UTC m=+3.763841926,LastTimestamp:2026-03-20 10:36:08.622296112 +0000 UTC m=+3.763841926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.233800 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d0d171ff0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.7207772 +0000 UTC m=+3.862323004,LastTimestamp:2026-03-20 10:36:08.7207772 +0000 UTC m=+3.862323004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.237486 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d0ddedb4d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.733866829 +0000 UTC m=+3.875412643,LastTimestamp:2026-03-20 10:36:08.733866829 +0000 UTC m=+3.875412643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.241382 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d3f7c4995 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:09.566267797 +0000 UTC m=+4.707813611,LastTimestamp:2026-03-20 10:36:09.566267797 +0000 UTC m=+4.707813611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.244635 4748 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d4ab09cf7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:09.754246391 +0000 UTC m=+4.895792205,LastTimestamp:2026-03-20 10:36:09.754246391 +0000 UTC m=+4.895792205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.248075 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d4b2e823f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:09.762497087 +0000 UTC m=+4.904042911,LastTimestamp:2026-03-20 10:36:09.762497087 +0000 UTC m=+4.904042911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.252035 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d4b40eef7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:09.763704567 +0000 UTC m=+4.905250382,LastTimestamp:2026-03-20 10:36:09.763704567 +0000 UTC m=+4.905250382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.256493 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d57089d21 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:09.961340193 +0000 UTC m=+5.102886007,LastTimestamp:2026-03-20 10:36:09.961340193 +0000 UTC m=+5.102886007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.261410 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d57cd3903 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:09.974225155 +0000 UTC m=+5.115771009,LastTimestamp:2026-03-20 10:36:09.974225155 +0000 UTC m=+5.115771009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.266465 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d57de31cf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:09.975337423 +0000 UTC m=+5.116883227,LastTimestamp:2026-03-20 10:36:09.975337423 +0000 UTC m=+5.116883227,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.271146 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d6481aeac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:10.187378348 +0000 UTC m=+5.328924192,LastTimestamp:2026-03-20 10:36:10.187378348 +0000 UTC m=+5.328924192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.275895 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d65354caf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:10.199149743 +0000 UTC m=+5.340695597,LastTimestamp:2026-03-20 10:36:10.199149743 +0000 UTC m=+5.340695597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.280486 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d6547cbaf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:10.200361903 +0000 UTC m=+5.341907717,LastTimestamp:2026-03-20 10:36:10.200361903 +0000 UTC m=+5.341907717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.286654 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d703c3284 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:10.384151172 +0000 UTC m=+5.525696986,LastTimestamp:2026-03-20 10:36:10.384151172 +0000 UTC m=+5.525696986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.290387 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d7147c90d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:10.401687821 +0000 UTC m=+5.543233645,LastTimestamp:2026-03-20 10:36:10.401687821 +0000 UTC m=+5.543233645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.294721 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d715d8dd8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:10.403114456 +0000 UTC m=+5.544660270,LastTimestamp:2026-03-20 10:36:10.403114456 +0000 UTC m=+5.544660270,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.299476 4748 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d81581904 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:10.671192324 +0000 UTC m=+5.812738168,LastTimestamp:2026-03-20 10:36:10.671192324 +0000 UTC m=+5.812738168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.305468 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e864d82575ebd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:10.687921853 +0000 UTC m=+5.829467657,LastTimestamp:2026-03-20 10:36:10.687921853 +0000 UTC m=+5.829467657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.312981 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:36:50 crc kubenswrapper[4748]: &Event{ObjectMeta:{kube-controller-manager-crc.189e864f02cdce42 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 10:36:50 crc kubenswrapper[4748]: body: Mar 20 10:36:50 crc kubenswrapper[4748]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:17.138134594 +0000 UTC m=+12.279680418,LastTimestamp:2026-03-20 10:36:17.138134594 +0000 UTC m=+12.279680418,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:36:50 crc kubenswrapper[4748]: > Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.317174 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864f02cee392 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:17.138205586 +0000 UTC m=+12.279751400,LastTimestamp:2026-03-20 10:36:17.138205586 +0000 UTC m=+12.279751400,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.322369 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:36:50 crc kubenswrapper[4748]: &Event{ObjectMeta:{kube-apiserver-crc.189e864f7daae574 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:32846->192.168.126.11:17697: read: connection reset by peer Mar 20 10:36:50 crc kubenswrapper[4748]: body: Mar 20 10:36:50 crc kubenswrapper[4748]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:19.19944434 +0000 UTC m=+14.340990184,LastTimestamp:2026-03-20 10:36:19.19944434 +0000 UTC m=+14.340990184,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:36:50 crc kubenswrapper[4748]: > Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.328464 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e864f7dabea16 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:32846->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:19.199511062 +0000 UTC m=+14.341056916,LastTimestamp:2026-03-20 10:36:19.199511062 +0000 UTC m=+14.341056916,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.334694 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e864cfb7ed6bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864cfb7ed6bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.425584319 +0000 UTC m=+3.567130133,LastTimestamp:2026-03-20 10:36:19.614405073 +0000 UTC m=+14.755950907,Count:2,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.345561 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e864d06840c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864d06840c8a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.610475146 +0000 UTC m=+3.752020960,LastTimestamp:2026-03-20 10:36:19.827797543 +0000 UTC m=+14.969343377,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.353268 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e864d07386c30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864d07386c30 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:08.622296112 +0000 UTC m=+3.763841926,LastTimestamp:2026-03-20 10:36:19.84046353 +0000 UTC m=+14.982009344,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.359268 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:36:50 crc kubenswrapper[4748]: &Event{ObjectMeta:{kube-apiserver-crc.189e864fa4de6d43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 10:36:50 crc kubenswrapper[4748]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:36:50 crc kubenswrapper[4748]: Mar 20 10:36:50 crc kubenswrapper[4748]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:19.857132867 +0000 UTC m=+14.998678721,LastTimestamp:2026-03-20 10:36:19.857132867 +0000 UTC m=+14.998678721,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:36:50 crc kubenswrapper[4748]: > Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.363916 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e864fa4df86f2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:19.857204978 +0000 UTC m=+14.998750832,LastTimestamp:2026-03-20 10:36:19.857204978 +0000 UTC m=+14.998750832,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.370582 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:36:50 crc kubenswrapper[4748]: &Event{ObjectMeta:{kube-controller-manager-crc.189e865156df7fcf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:36:50 crc kubenswrapper[4748]: body: Mar 20 10:36:50 crc kubenswrapper[4748]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:27.138514895 +0000 UTC 
m=+22.280060709,LastTimestamp:2026-03-20 10:36:27.138514895 +0000 UTC m=+22.280060709,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:36:50 crc kubenswrapper[4748]: > Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.375734 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e865156e076e3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:27.138578147 +0000 UTC m=+22.280123961,LastTimestamp:2026-03-20 10:36:27.138578147 +0000 UTC m=+22.280123961,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.381146 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e865156df7fcf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:36:50 crc kubenswrapper[4748]: &Event{ObjectMeta:{kube-controller-manager-crc.189e865156df7fcf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:36:50 crc kubenswrapper[4748]: body: Mar 20 10:36:50 crc kubenswrapper[4748]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:27.138514895 +0000 UTC m=+22.280060709,LastTimestamp:2026-03-20 10:36:37.138985434 +0000 UTC m=+32.280531288,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:36:50 crc kubenswrapper[4748]: > Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.385030 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e865156e076e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e865156e076e3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:27.138578147 +0000 UTC m=+22.280123961,LastTimestamp:2026-03-20 
10:36:37.139058526 +0000 UTC m=+32.280604380,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.388978 4748 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8653ab232246 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:37.142168134 +0000 UTC m=+32.283713988,LastTimestamp:2026-03-20 10:36:37.142168134 +0000 UTC m=+32.283713988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.393190 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e864c902b4b16\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864c902b4b16 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.624946966 +0000 UTC m=+1.766492780,LastTimestamp:2026-03-20 10:36:37.261147012 +0000 UTC m=+32.402692826,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.397258 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e864ca0d7d0c1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864ca0d7d0c1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.904688833 +0000 UTC m=+2.046234647,LastTimestamp:2026-03-20 10:36:37.430869698 +0000 UTC m=+32.572415552,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.400662 4748 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189e864ca19a120d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e864ca19a120d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:06.917419533 +0000 UTC m=+2.058965347,LastTimestamp:2026-03-20 10:36:37.440409237 +0000 UTC m=+32.581955071,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.413250 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e865156df7fcf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:36:50 crc kubenswrapper[4748]: &Event{ObjectMeta:{kube-controller-manager-crc.189e865156df7fcf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
20 10:36:50 crc kubenswrapper[4748]: body: Mar 20 10:36:50 crc kubenswrapper[4748]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:27.138514895 +0000 UTC m=+22.280060709,LastTimestamp:2026-03-20 10:36:47.138757151 +0000 UTC m=+42.280302975,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:36:50 crc kubenswrapper[4748]: > Mar 20 10:36:50 crc kubenswrapper[4748]: E0320 10:36:50.417618 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e865156e076e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e865156e076e3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:27.138578147 +0000 UTC m=+22.280123961,LastTimestamp:2026-03-20 10:36:47.138818382 +0000 UTC m=+42.280364196,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:36:50 crc kubenswrapper[4748]: I0320 10:36:50.458201 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 20 10:36:51 crc kubenswrapper[4748]: I0320 10:36:51.453736 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:36:52 crc kubenswrapper[4748]: I0320 10:36:52.455937 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:36:52 crc kubenswrapper[4748]: W0320 10:36:52.474495 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 10:36:52 crc kubenswrapper[4748]: E0320 10:36:52.474613 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 10:36:52 crc kubenswrapper[4748]: W0320 10:36:52.536594 4748 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 10:36:52 crc kubenswrapper[4748]: E0320 10:36:52.536731 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 10:36:53 crc 
kubenswrapper[4748]: I0320 10:36:53.454017 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:36:54 crc kubenswrapper[4748]: E0320 10:36:54.284377 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:36:54 crc kubenswrapper[4748]: I0320 10:36:54.296501 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:54 crc kubenswrapper[4748]: I0320 10:36:54.298072 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:54 crc kubenswrapper[4748]: I0320 10:36:54.298125 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:54 crc kubenswrapper[4748]: I0320 10:36:54.298142 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:54 crc kubenswrapper[4748]: I0320 10:36:54.298176 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:36:54 crc kubenswrapper[4748]: E0320 10:36:54.304212 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:36:54 crc kubenswrapper[4748]: I0320 10:36:54.453888 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 20 10:36:55 crc kubenswrapper[4748]: I0320 10:36:55.455069 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:36:55 crc kubenswrapper[4748]: E0320 10:36:55.586519 4748 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:36:56 crc kubenswrapper[4748]: I0320 10:36:56.453356 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:36:57 crc kubenswrapper[4748]: I0320 10:36:57.139779 4748 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:36:57 crc kubenswrapper[4748]: I0320 10:36:57.139945 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:36:57 crc kubenswrapper[4748]: E0320 10:36:57.148001 4748 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e865156df7fcf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 20 10:36:57 crc kubenswrapper[4748]: &Event{ObjectMeta:{kube-controller-manager-crc.189e865156df7fcf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:36:57 crc kubenswrapper[4748]: body: Mar 20 10:36:57 crc kubenswrapper[4748]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:36:27.138514895 +0000 UTC m=+22.280060709,LastTimestamp:2026-03-20 10:36:57.13991671 +0000 UTC m=+52.281462574,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:36:57 crc kubenswrapper[4748]: > Mar 20 10:36:57 crc kubenswrapper[4748]: I0320 10:36:57.453488 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:36:58 crc kubenswrapper[4748]: I0320 10:36:58.455494 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:36:58 crc kubenswrapper[4748]: I0320 10:36:58.864869 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:36:58 crc kubenswrapper[4748]: I0320 
10:36:58.865310 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:36:58 crc kubenswrapper[4748]: I0320 10:36:58.866909 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:36:58 crc kubenswrapper[4748]: I0320 10:36:58.866954 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:36:58 crc kubenswrapper[4748]: I0320 10:36:58.866963 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:36:59 crc kubenswrapper[4748]: I0320 10:36:59.455099 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:00 crc kubenswrapper[4748]: I0320 10:37:00.455027 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:00 crc kubenswrapper[4748]: I0320 10:37:00.515341 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:00 crc kubenswrapper[4748]: I0320 10:37:00.516825 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:00 crc kubenswrapper[4748]: I0320 10:37:00.516912 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:00 crc kubenswrapper[4748]: I0320 10:37:00.516930 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:00 crc kubenswrapper[4748]: I0320 10:37:00.517772 4748 
scope.go:117] "RemoveContainer" containerID="c78f06d28a973f07c57c8737ae920f13230f8b7712f86c5d779b4e6abf22c67e" Mar 20 10:37:00 crc kubenswrapper[4748]: E0320 10:37:00.518088 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:37:01 crc kubenswrapper[4748]: E0320 10:37:01.291975 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:37:01 crc kubenswrapper[4748]: I0320 10:37:01.304410 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:01 crc kubenswrapper[4748]: I0320 10:37:01.306331 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:01 crc kubenswrapper[4748]: I0320 10:37:01.306390 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:01 crc kubenswrapper[4748]: I0320 10:37:01.306405 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:01 crc kubenswrapper[4748]: I0320 10:37:01.306443 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:37:01 crc kubenswrapper[4748]: E0320 10:37:01.312986 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" 
Mar 20 10:37:01 crc kubenswrapper[4748]: I0320 10:37:01.454335 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:02 crc kubenswrapper[4748]: I0320 10:37:02.453622 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:03 crc kubenswrapper[4748]: I0320 10:37:03.454487 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.143075 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.143502 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.144687 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.144745 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.144767 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.147692 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.463076 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.776629 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.778083 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.778121 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:04 crc kubenswrapper[4748]: I0320 10:37:04.778131 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:05 crc kubenswrapper[4748]: I0320 10:37:05.453279 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:05 crc kubenswrapper[4748]: E0320 10:37:05.586947 4748 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:37:06 crc kubenswrapper[4748]: I0320 10:37:06.452552 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:07 crc kubenswrapper[4748]: I0320 10:37:07.453952 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:08 crc kubenswrapper[4748]: E0320 10:37:08.297050 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:37:08 crc kubenswrapper[4748]: I0320 10:37:08.313719 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:08 crc kubenswrapper[4748]: I0320 10:37:08.315031 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:08 crc kubenswrapper[4748]: I0320 10:37:08.315059 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:08 crc kubenswrapper[4748]: I0320 10:37:08.315096 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:08 crc kubenswrapper[4748]: I0320 10:37:08.315121 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:37:08 crc kubenswrapper[4748]: E0320 10:37:08.318411 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:37:08 crc kubenswrapper[4748]: I0320 10:37:08.451346 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:09 crc kubenswrapper[4748]: I0320 10:37:09.451720 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:10 crc kubenswrapper[4748]: I0320 10:37:10.452256 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:11 crc kubenswrapper[4748]: I0320 10:37:11.451981 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.451573 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.514655 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.515800 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.515858 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.515870 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.516359 4748 scope.go:117] "RemoveContainer" containerID="c78f06d28a973f07c57c8737ae920f13230f8b7712f86c5d779b4e6abf22c67e" Mar 20 10:37:12 crc kubenswrapper[4748]: W0320 10:37:12.610801 4748 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 10:37:12 crc kubenswrapper[4748]: E0320 10:37:12.610885 4748 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.800335 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.802915 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6"} Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.803113 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.804592 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.804652 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:12 crc kubenswrapper[4748]: I0320 10:37:12.804676 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.452597 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.807371 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.807719 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.809665 4748 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6" exitCode=255 Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.809709 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6"} Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.809771 4748 scope.go:117] "RemoveContainer" containerID="c78f06d28a973f07c57c8737ae920f13230f8b7712f86c5d779b4e6abf22c67e" Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.809923 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.810925 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.810976 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.810995 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 10:37:13 crc kubenswrapper[4748]: I0320 10:37:13.811661 4748 scope.go:117] "RemoveContainer" containerID="abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6" Mar 20 10:37:13 crc kubenswrapper[4748]: E0320 10:37:13.811886 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:37:14 crc kubenswrapper[4748]: I0320 10:37:14.452005 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:14 crc kubenswrapper[4748]: I0320 10:37:14.814893 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:37:15 crc kubenswrapper[4748]: E0320 10:37:15.309595 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:37:15 crc kubenswrapper[4748]: I0320 10:37:15.319018 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:15 crc kubenswrapper[4748]: I0320 10:37:15.320416 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:15 crc kubenswrapper[4748]: I0320 10:37:15.320484 4748 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:15 crc kubenswrapper[4748]: I0320 10:37:15.320504 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:15 crc kubenswrapper[4748]: I0320 10:37:15.320538 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:37:15 crc kubenswrapper[4748]: E0320 10:37:15.325450 4748 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:37:15 crc kubenswrapper[4748]: I0320 10:37:15.451668 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:15 crc kubenswrapper[4748]: E0320 10:37:15.587997 4748 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:37:15 crc kubenswrapper[4748]: I0320 10:37:15.912024 4748 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:37:15 crc kubenswrapper[4748]: I0320 10:37:15.925027 4748 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 10:37:16 crc kubenswrapper[4748]: I0320 10:37:16.453010 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:17 crc kubenswrapper[4748]: I0320 10:37:17.454927 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:18 crc kubenswrapper[4748]: I0320 10:37:18.407012 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:37:18 crc kubenswrapper[4748]: I0320 10:37:18.407190 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:18 crc kubenswrapper[4748]: I0320 10:37:18.408311 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:18 crc kubenswrapper[4748]: I0320 10:37:18.408338 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:18 crc kubenswrapper[4748]: I0320 10:37:18.408350 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:18 crc kubenswrapper[4748]: I0320 10:37:18.408917 4748 scope.go:117] "RemoveContainer" containerID="abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6" Mar 20 10:37:18 crc kubenswrapper[4748]: E0320 10:37:18.409103 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:37:18 crc kubenswrapper[4748]: I0320 10:37:18.452675 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:19 crc kubenswrapper[4748]: I0320 10:37:19.452850 4748 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:37:19 crc kubenswrapper[4748]: I0320 10:37:19.777359 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:37:19 crc kubenswrapper[4748]: I0320 10:37:19.777602 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:19 crc kubenswrapper[4748]: I0320 10:37:19.779231 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:19 crc kubenswrapper[4748]: I0320 10:37:19.779271 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:19 crc kubenswrapper[4748]: I0320 10:37:19.779285 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:19 crc kubenswrapper[4748]: I0320 10:37:19.779866 4748 scope.go:117] "RemoveContainer" containerID="abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6" Mar 20 10:37:19 crc kubenswrapper[4748]: E0320 10:37:19.780027 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:37:20 crc kubenswrapper[4748]: I0320 10:37:20.453649 4748 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in 
API group "storage.k8s.io" at the cluster scope Mar 20 10:37:20 crc kubenswrapper[4748]: I0320 10:37:20.474355 4748 csr.go:261] certificate signing request csr-4zmkx is approved, waiting to be issued Mar 20 10:37:20 crc kubenswrapper[4748]: I0320 10:37:20.486118 4748 csr.go:257] certificate signing request csr-4zmkx is issued Mar 20 10:37:20 crc kubenswrapper[4748]: I0320 10:37:20.510797 4748 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 10:37:21 crc kubenswrapper[4748]: I0320 10:37:21.311042 4748 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 10:37:21 crc kubenswrapper[4748]: I0320 10:37:21.487239 4748 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-10 23:49:07.965091449 +0000 UTC Mar 20 10:37:21 crc kubenswrapper[4748]: I0320 10:37:21.487288 4748 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7117h11m46.477808727s for next certificate rotation Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.326101 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.327384 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.327422 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.327433 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.327537 4748 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.335585 4748 
kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.335906 4748 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.335941 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.340977 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.341049 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.341065 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.341089 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.341104 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:22Z","lastTransitionTime":"2026-03-20T10:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.359166 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.366996 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.367070 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.367135 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.367169 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.367189 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:22Z","lastTransitionTime":"2026-03-20T10:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.378013 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.385913 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.385942 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.385951 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.385967 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.385979 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:22Z","lastTransitionTime":"2026-03-20T10:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.395369 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.404564 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.404598 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.404609 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.404627 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:22 crc kubenswrapper[4748]: I0320 10:37:22.404638 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:22Z","lastTransitionTime":"2026-03-20T10:37:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.422376 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.422597 4748 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.422646 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.523698 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.624418 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.725496 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.826551 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:22 crc kubenswrapper[4748]: E0320 10:37:22.927016 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:23 crc kubenswrapper[4748]: E0320 10:37:23.027583 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:23 crc kubenswrapper[4748]: E0320 10:37:23.128347 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:23 crc kubenswrapper[4748]: E0320 10:37:23.229229 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:23 crc kubenswrapper[4748]: E0320 10:37:23.330075 4748 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:23 crc kubenswrapper[4748]: E0320 10:37:23.431261 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:23 crc kubenswrapper[4748]: E0320 10:37:23.531709 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:23 crc kubenswrapper[4748]: E0320 10:37:23.632414 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:23 crc kubenswrapper[4748]: E0320 10:37:23.733782 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:23 crc kubenswrapper[4748]: E0320 10:37:23.834993 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:23 crc kubenswrapper[4748]: E0320 10:37:23.935535 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:24 crc kubenswrapper[4748]: E0320 10:37:24.036419 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:24 crc kubenswrapper[4748]: E0320 10:37:24.137796 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:24 crc kubenswrapper[4748]: E0320 10:37:24.239128 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:24 crc kubenswrapper[4748]: E0320 10:37:24.339278 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:24 crc kubenswrapper[4748]: E0320 10:37:24.439708 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:24 crc 
kubenswrapper[4748]: E0320 10:37:24.540873 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:24 crc kubenswrapper[4748]: E0320 10:37:24.641395 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:24 crc kubenswrapper[4748]: E0320 10:37:24.742036 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:24 crc kubenswrapper[4748]: E0320 10:37:24.842804 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:24 crc kubenswrapper[4748]: E0320 10:37:24.943701 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.043816 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.144903 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.245857 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.346940 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.447885 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.548489 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.588962 4748 eviction_manager.go:285] "Eviction manager: failed to get summary stats" 
err="failed to get node info: node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.648981 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.750109 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.851090 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:25 crc kubenswrapper[4748]: E0320 10:37:25.951675 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:26 crc kubenswrapper[4748]: E0320 10:37:26.052456 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:26 crc kubenswrapper[4748]: E0320 10:37:26.153052 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:26 crc kubenswrapper[4748]: E0320 10:37:26.253652 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:26 crc kubenswrapper[4748]: E0320 10:37:26.354725 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:26 crc kubenswrapper[4748]: E0320 10:37:26.455008 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:26 crc kubenswrapper[4748]: E0320 10:37:26.555169 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:26 crc kubenswrapper[4748]: E0320 10:37:26.655728 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:26 crc kubenswrapper[4748]: E0320 10:37:26.756794 4748 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:26 crc kubenswrapper[4748]: E0320 10:37:26.857827 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:26 crc kubenswrapper[4748]: E0320 10:37:26.958780 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:27 crc kubenswrapper[4748]: E0320 10:37:27.059959 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:27 crc kubenswrapper[4748]: E0320 10:37:27.160490 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:27 crc kubenswrapper[4748]: E0320 10:37:27.260596 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:27 crc kubenswrapper[4748]: E0320 10:37:27.361745 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:27 crc kubenswrapper[4748]: E0320 10:37:27.462763 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:27 crc kubenswrapper[4748]: E0320 10:37:27.563423 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:27 crc kubenswrapper[4748]: E0320 10:37:27.664477 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:27 crc kubenswrapper[4748]: E0320 10:37:27.764801 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:27 crc kubenswrapper[4748]: E0320 10:37:27.865698 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:27 crc 
kubenswrapper[4748]: E0320 10:37:27.965903 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:28 crc kubenswrapper[4748]: E0320 10:37:28.066088 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:28 crc kubenswrapper[4748]: E0320 10:37:28.166479 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:28 crc kubenswrapper[4748]: E0320 10:37:28.267634 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:28 crc kubenswrapper[4748]: E0320 10:37:28.368241 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:28 crc kubenswrapper[4748]: E0320 10:37:28.469344 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:28 crc kubenswrapper[4748]: E0320 10:37:28.570476 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:28 crc kubenswrapper[4748]: E0320 10:37:28.671039 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:28 crc kubenswrapper[4748]: E0320 10:37:28.771502 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:28 crc kubenswrapper[4748]: E0320 10:37:28.872181 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:28 crc kubenswrapper[4748]: E0320 10:37:28.972924 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:29 crc kubenswrapper[4748]: E0320 10:37:29.073063 4748 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 10:37:29 crc kubenswrapper[4748]: E0320 10:37:29.173446 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:29 crc kubenswrapper[4748]: E0320 10:37:29.273981 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:29 crc kubenswrapper[4748]: E0320 10:37:29.374925 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:29 crc kubenswrapper[4748]: E0320 10:37:29.475781 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:29 crc kubenswrapper[4748]: E0320 10:37:29.576567 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:29 crc kubenswrapper[4748]: E0320 10:37:29.677121 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:29 crc kubenswrapper[4748]: E0320 10:37:29.777275 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:29 crc kubenswrapper[4748]: E0320 10:37:29.878320 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:29 crc kubenswrapper[4748]: E0320 10:37:29.979265 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:30 crc kubenswrapper[4748]: E0320 10:37:30.079988 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:30 crc kubenswrapper[4748]: E0320 10:37:30.180160 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:30 crc kubenswrapper[4748]: E0320 10:37:30.280365 4748 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:30 crc kubenswrapper[4748]: E0320 10:37:30.381253 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:30 crc kubenswrapper[4748]: E0320 10:37:30.481910 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:30 crc kubenswrapper[4748]: E0320 10:37:30.582924 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:30 crc kubenswrapper[4748]: E0320 10:37:30.683657 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:30 crc kubenswrapper[4748]: E0320 10:37:30.784040 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:30 crc kubenswrapper[4748]: E0320 10:37:30.884670 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:30 crc kubenswrapper[4748]: E0320 10:37:30.985016 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:31 crc kubenswrapper[4748]: E0320 10:37:31.085945 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:31 crc kubenswrapper[4748]: E0320 10:37:31.187093 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:31 crc kubenswrapper[4748]: E0320 10:37:31.287548 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:31 crc kubenswrapper[4748]: E0320 10:37:31.388635 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:31 crc kubenswrapper[4748]: E0320 
10:37:31.489608 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:31 crc kubenswrapper[4748]: I0320 10:37:31.515347 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:31 crc kubenswrapper[4748]: I0320 10:37:31.516805 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:31 crc kubenswrapper[4748]: I0320 10:37:31.516911 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:31 crc kubenswrapper[4748]: I0320 10:37:31.516934 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:31 crc kubenswrapper[4748]: I0320 10:37:31.518050 4748 scope.go:117] "RemoveContainer" containerID="abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6" Mar 20 10:37:31 crc kubenswrapper[4748]: E0320 10:37:31.518529 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:37:31 crc kubenswrapper[4748]: E0320 10:37:31.590411 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:31 crc kubenswrapper[4748]: E0320 10:37:31.690824 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:31 crc kubenswrapper[4748]: E0320 10:37:31.791462 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:31 crc kubenswrapper[4748]: 
E0320 10:37:31.892411 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:31 crc kubenswrapper[4748]: E0320 10:37:31.992882 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.093899 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.194949 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.295123 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.396300 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.451181 4748 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.496483 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.597377 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.617784 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.624098 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.624169 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.624197 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.624228 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.624250 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:32Z","lastTransitionTime":"2026-03-20T10:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.639421 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.644253 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.644302 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.644320 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.644341 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.644360 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:32Z","lastTransitionTime":"2026-03-20T10:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.660793 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.665869 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.665933 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.665947 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.665974 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.665989 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:32Z","lastTransitionTime":"2026-03-20T10:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.681790 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.687174 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.687233 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.687251 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.687277 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:32 crc kubenswrapper[4748]: I0320 10:37:32.687295 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:32Z","lastTransitionTime":"2026-03-20T10:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.705560 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.709224 4748 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.709297 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.809586 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:32 crc kubenswrapper[4748]: E0320 10:37:32.910440 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:33 crc kubenswrapper[4748]: E0320 10:37:33.010946 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:33 crc kubenswrapper[4748]: E0320 10:37:33.111921 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:33 crc kubenswrapper[4748]: E0320 10:37:33.213028 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:33 crc kubenswrapper[4748]: E0320 10:37:33.314125 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:33 crc kubenswrapper[4748]: E0320 10:37:33.415086 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:33 crc kubenswrapper[4748]: E0320 10:37:33.516223 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:33 crc kubenswrapper[4748]: E0320 10:37:33.617387 4748 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:33 crc kubenswrapper[4748]: E0320 10:37:33.718020 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:33 crc kubenswrapper[4748]: E0320 10:37:33.818567 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:33 crc kubenswrapper[4748]: E0320 10:37:33.918815 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:34 crc kubenswrapper[4748]: E0320 10:37:34.019419 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:34 crc kubenswrapper[4748]: E0320 10:37:34.120419 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:34 crc kubenswrapper[4748]: E0320 10:37:34.221450 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:34 crc kubenswrapper[4748]: E0320 10:37:34.321610 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:34 crc kubenswrapper[4748]: E0320 10:37:34.422730 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:34 crc kubenswrapper[4748]: E0320 10:37:34.522947 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:34 crc kubenswrapper[4748]: E0320 10:37:34.624015 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:34 crc kubenswrapper[4748]: E0320 10:37:34.724351 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:34 crc 
kubenswrapper[4748]: I0320 10:37:34.743632 4748 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 10:37:34 crc kubenswrapper[4748]: E0320 10:37:34.825148 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:34 crc kubenswrapper[4748]: E0320 10:37:34.925290 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.025676 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.125939 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.227054 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.327664 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.428128 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: I0320 10:37:35.514728 4748 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:37:35 crc kubenswrapper[4748]: I0320 10:37:35.516148 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:35 crc kubenswrapper[4748]: I0320 10:37:35.516211 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:35 crc kubenswrapper[4748]: I0320 10:37:35.516224 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.528635 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.589392 4748 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.629754 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.730298 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.831000 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:35 crc kubenswrapper[4748]: E0320 10:37:35.932021 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:36 crc kubenswrapper[4748]: E0320 10:37:36.032465 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:36 crc kubenswrapper[4748]: E0320 10:37:36.133498 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:36 crc kubenswrapper[4748]: E0320 10:37:36.233683 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:36 crc kubenswrapper[4748]: E0320 10:37:36.334789 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:36 crc kubenswrapper[4748]: E0320 10:37:36.435466 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:36 crc kubenswrapper[4748]: E0320 10:37:36.535693 4748 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:36 crc kubenswrapper[4748]: E0320 10:37:36.637117 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:36 crc kubenswrapper[4748]: E0320 10:37:36.738161 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:36 crc kubenswrapper[4748]: E0320 10:37:36.839159 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:36 crc kubenswrapper[4748]: E0320 10:37:36.939941 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:37 crc kubenswrapper[4748]: E0320 10:37:37.040990 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:37 crc kubenswrapper[4748]: E0320 10:37:37.141346 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:37 crc kubenswrapper[4748]: E0320 10:37:37.242012 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:37 crc kubenswrapper[4748]: E0320 10:37:37.342963 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:37 crc kubenswrapper[4748]: E0320 10:37:37.443957 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:37 crc kubenswrapper[4748]: E0320 10:37:37.544737 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:37 crc kubenswrapper[4748]: E0320 10:37:37.645366 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:37 crc 
kubenswrapper[4748]: E0320 10:37:37.746363 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:37 crc kubenswrapper[4748]: E0320 10:37:37.846783 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:37 crc kubenswrapper[4748]: E0320 10:37:37.947999 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:38 crc kubenswrapper[4748]: E0320 10:37:38.048132 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:38 crc kubenswrapper[4748]: E0320 10:37:38.148898 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:38 crc kubenswrapper[4748]: E0320 10:37:38.249975 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:38 crc kubenswrapper[4748]: E0320 10:37:38.350646 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:38 crc kubenswrapper[4748]: E0320 10:37:38.451784 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:38 crc kubenswrapper[4748]: E0320 10:37:38.552762 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:38 crc kubenswrapper[4748]: E0320 10:37:38.653894 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:38 crc kubenswrapper[4748]: E0320 10:37:38.754746 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:38 crc kubenswrapper[4748]: E0320 10:37:38.855423 4748 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 10:37:38 crc kubenswrapper[4748]: E0320 10:37:38.956102 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:39 crc kubenswrapper[4748]: E0320 10:37:39.057169 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:39 crc kubenswrapper[4748]: E0320 10:37:39.158324 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:39 crc kubenswrapper[4748]: E0320 10:37:39.259512 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:39 crc kubenswrapper[4748]: E0320 10:37:39.360011 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:39 crc kubenswrapper[4748]: E0320 10:37:39.460454 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:39 crc kubenswrapper[4748]: E0320 10:37:39.560806 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:39 crc kubenswrapper[4748]: E0320 10:37:39.661827 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:39 crc kubenswrapper[4748]: E0320 10:37:39.762628 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:39 crc kubenswrapper[4748]: E0320 10:37:39.863172 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:39 crc kubenswrapper[4748]: E0320 10:37:39.964321 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:40 crc kubenswrapper[4748]: E0320 10:37:40.065031 4748 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:40 crc kubenswrapper[4748]: E0320 10:37:40.165749 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:40 crc kubenswrapper[4748]: E0320 10:37:40.266445 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:40 crc kubenswrapper[4748]: E0320 10:37:40.367056 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:40 crc kubenswrapper[4748]: E0320 10:37:40.467159 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:40 crc kubenswrapper[4748]: E0320 10:37:40.567730 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:40 crc kubenswrapper[4748]: E0320 10:37:40.667927 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:40 crc kubenswrapper[4748]: E0320 10:37:40.768865 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:40 crc kubenswrapper[4748]: E0320 10:37:40.869994 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:40 crc kubenswrapper[4748]: E0320 10:37:40.970978 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:41 crc kubenswrapper[4748]: E0320 10:37:41.071710 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:41 crc kubenswrapper[4748]: E0320 10:37:41.172118 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:41 crc kubenswrapper[4748]: E0320 
10:37:41.273161 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:41 crc kubenswrapper[4748]: E0320 10:37:41.373556 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:41 crc kubenswrapper[4748]: E0320 10:37:41.474628 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:41 crc kubenswrapper[4748]: E0320 10:37:41.575065 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:41 crc kubenswrapper[4748]: E0320 10:37:41.676164 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:41 crc kubenswrapper[4748]: E0320 10:37:41.776971 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:41 crc kubenswrapper[4748]: E0320 10:37:41.877329 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:41 crc kubenswrapper[4748]: E0320 10:37:41.977484 4748 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.074544 4748 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.080685 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.080737 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.080752 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.080774 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.080789 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.183497 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.183556 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.183572 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.183593 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.183607 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.286378 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.286471 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.286482 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.286500 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.286514 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.389093 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.389133 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.389144 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.389162 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.389175 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.488893 4748 apiserver.go:52] "Watching apiserver" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.491894 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.491961 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.491981 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.492009 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.492028 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.495669 4748 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.496132 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-w68d2","openshift-image-registry/node-ca-ftkzt","openshift-multus/multus-additional-cni-plugins-qnjmr","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-xdzb8","openshift-machine-config-operator/machine-config-daemon-5lbvz","openshift-multus/multus-z5ksw","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.496500 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.496576 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.496626 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.496647 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.496659 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.496946 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.497005 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.497154 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.497209 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.497375 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w68d2" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.497432 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.497609 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.497784 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.501464 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.501473 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.506978 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507208 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507226 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507297 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507323 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507514 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507573 
4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507585 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507624 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507747 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507800 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.507980 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.508023 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.508049 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.508097 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.508067 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.508300 4748 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.508406 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.508497 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.508603 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.510440 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.510518 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.510552 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.510712 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.510768 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.510780 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.510889 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.510903 4748 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.510947 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.510983 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.511075 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.511206 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.511373 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.511383 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.511567 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.528388 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.528818 4748 scope.go:117] "RemoveContainer" containerID="abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.529023 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.530920 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.541657 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.551641 4748 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.551870 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.566523 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.578117 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.592448 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.594381 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.594409 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.594422 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.594440 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.594452 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.604160 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.612607 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.623930 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.631459 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.639916 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646247 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646311 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646343 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646361 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646381 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646397 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646412 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646426 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646441 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646458 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646475 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646579 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646596 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646611 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646627 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646642 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646658 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646674 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646692 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646708 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646725 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646739 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646758 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646776 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646794 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646812 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646830 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646859 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646877 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646893 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646911 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646927 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646942 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646959 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646977 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646993 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647010 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647025 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647040 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647055 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647069 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647084 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647100 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647114 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647154 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647171 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647188 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647206 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647221 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647237 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647252 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647266 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647283 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647298 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647314 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647328 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647345 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647361 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647377 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647393 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647408 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647423 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647440 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647455 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647470 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647487 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647502 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647517 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647532 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647552 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647569 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647584 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647599 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647613 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647628 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647644 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647659 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647674 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647689 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647707 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:37:42 
crc kubenswrapper[4748]: I0320 10:37:42.647724 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647739 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647754 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647771 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647787 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647804 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647819 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647849 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647866 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647882 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647899 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 
20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647915 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647931 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647947 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647964 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647980 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648015 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648031 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648048 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648065 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648085 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648101 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648127 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648142 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648158 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648173 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648191 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648206 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648221 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648236 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648251 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648267 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648282 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648296 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648313 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648329 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648343 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648359 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648373 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648389 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648409 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648431 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648455 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648478 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:37:42 
crc kubenswrapper[4748]: I0320 10:37:42.648504 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648529 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.646749 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.651575 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647415 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.647687 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648063 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.648553 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:37:43.14853251 +0000 UTC m=+98.290078324 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648811 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648886 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649023 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.648992 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649244 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649299 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649421 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649523 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649525 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649709 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649795 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649822 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649914 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.649989 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.650002 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.650104 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.650379 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.650421 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.650443 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.650534 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.650565 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.650675 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.651080 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.651616 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.651618 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.651823 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.651903 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.651932 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.651951 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.651969 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.651985 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.652003 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.652225 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.652232 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.652272 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.652285 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.652418 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.652475 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.652544 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.652822 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.653099 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.653218 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.653445 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.653566 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.653578 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.653669 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.653802 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.653827 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.654198 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.654353 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.654378 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.655066 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.655092 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.655170 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.655272 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.655363 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.655559 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.655703 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.655749 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.655896 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.656021 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.656037 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.656138 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.656191 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.656211 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.656329 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.656368 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.656521 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.656534 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.656537 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657047 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657196 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657395 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657418 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657583 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657766 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657825 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657857 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657932 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657937 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657958 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658065 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658082 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658389 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658486 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658649 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658656 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658708 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658776 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658859 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658931 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.658957 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.659235 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.659291 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.659321 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.659356 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.659364 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.659535 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.659754 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.659772 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: 
"6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.659711 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.657909 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.659929 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.660015 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.660110 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.660181 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.660262 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.660337 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.660472 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.660938 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.660958 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.660977 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.660995 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661018 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661035 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661053 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661069 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661086 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661105 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661123 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661141 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") 
" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661158 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661175 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661172 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661193 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661215 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661303 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661332 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661357 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661384 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661407 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661429 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661451 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661473 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661504 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661526 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661552 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " 
Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661568 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661576 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661663 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661685 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661712 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661854 4748 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661856 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661871 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661897 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661915 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661930 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661948 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661951 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661966 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661969 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.661982 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662025 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662049 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662073 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662098 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662123 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662179 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662200 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662219 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662240 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662317 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srwmv\" (UniqueName: \"kubernetes.io/projected/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-kube-api-access-srwmv\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: 
I0320 10:37:42.662343 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-ovn\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662361 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-node-log\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662380 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-hostroot\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662399 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16bd6321-67e6-40c7-9ad0-5c9035765e5d-cni-binary-copy\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662452 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662506 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662516 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662545 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662626 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-rootfs\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.662927 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663078 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-conf-dir\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663113 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663164 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663199 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663216 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-netns\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663243 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-netd\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663262 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-run-multus-certs\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663286 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663306 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-run-k8s-cni-cncf-io\") pod \"multus-z5ksw\" (UID: 
\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663326 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1cb89200-ecbf-4725-b48f-801aaecd6ad0-hosts-file\") pod \"node-resolver-w68d2\" (UID: \"1cb89200-ecbf-4725-b48f-801aaecd6ad0\") " pod="openshift-dns/node-resolver-w68d2" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663348 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-bin\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663370 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663372 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663390 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-proxy-tls\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663409 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-cnibin\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663428 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-cni-binary-copy\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663448 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663458 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663479 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663503 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-mcd-auth-proxy-config\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663522 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663543 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-system-cni-dir\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663561 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6ncv\" 
(UniqueName: \"kubernetes.io/projected/16bd6321-67e6-40c7-9ad0-5c9035765e5d-kube-api-access-z6ncv\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663580 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-os-release\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663585 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663599 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-socket-dir-parent\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663726 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663617 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-var-lib-kubelet\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663782 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663804 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-log-socket\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663824 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-env-overrides\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663884 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-var-lib-cni-multus\") pod \"multus-z5ksw\" (UID: 
\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663908 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szq25\" (UniqueName: \"kubernetes.io/projected/1cb89200-ecbf-4725-b48f-801aaecd6ad0-kube-api-access-szq25\") pod \"node-resolver-w68d2\" (UID: \"1cb89200-ecbf-4725-b48f-801aaecd6ad0\") " pod="openshift-dns/node-resolver-w68d2" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663932 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663953 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-etc-openvswitch\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663975 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663993 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-run-netns\") pod 
\"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664014 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664032 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664056 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-systemd\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664077 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-ovn-kubernetes\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664098 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664120 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-script-lib\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664153 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndt8m\" (UniqueName: \"kubernetes.io/projected/f31addae-43ae-459d-bf9d-b5c0ee58faba-kube-api-access-ndt8m\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664176 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fksm\" (UniqueName: \"kubernetes.io/projected/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-kube-api-access-8fksm\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664197 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-os-release\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664217 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664243 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9l9h\" (UniqueName: \"kubernetes.io/projected/bb932f5b-ebd7-48e2-ba20-3d1633290c8e-kube-api-access-k9l9h\") pod \"node-ca-ftkzt\" (UID: \"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\") " pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664263 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-slash\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664172 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b97
4db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664280 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-daemon-config\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664302 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-systemd-units\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664318 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-var-lib-openvswitch\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 
crc kubenswrapper[4748]: I0320 10:37:42.664337 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovn-node-metrics-cert\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664368 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-system-cni-dir\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664388 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-etc-kubernetes\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663903 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664411 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-cnibin\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.663919 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664433 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/16bd6321-67e6-40c7-9ad0-5c9035765e5d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664457 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb932f5b-ebd7-48e2-ba20-3d1633290c8e-host\") pod \"node-ca-ftkzt\" (UID: \"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\") " pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664477 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/bb932f5b-ebd7-48e2-ba20-3d1633290c8e-serviceca\") pod \"node-ca-ftkzt\" (UID: \"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\") " pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664502 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664522 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.665146 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-openvswitch\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.665440 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-config\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.665486 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-cni-dir\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.665540 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-var-lib-cni-bin\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.665607 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-kubelet\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666047 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666069 4748 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666085 4748 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666105 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666202 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666216 4748 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666229 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666248 4748 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666262 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666316 4748 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666328 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc 
kubenswrapper[4748]: I0320 10:37:42.666344 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666415 4748 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666431 4748 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666447 4748 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666457 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666470 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666485 4748 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666500 4748 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666514 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666525 4748 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666555 4748 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666572 4748 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666589 4748 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666601 4748 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666710 4748 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666723 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666736 4748 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666749 4748 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666873 4748 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666889 4748 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666902 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666914 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 
crc kubenswrapper[4748]: I0320 10:37:42.667060 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667057 4748 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667075 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667088 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667201 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667221 4748 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667233 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667245 4748 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667261 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667274 4748 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667285 4748 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667297 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667312 4748 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667383 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667400 4748 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667413 4748 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667431 4748 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667484 4748 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667501 4748 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667518 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667530 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667568 4748 reconciler_common.go:293] "Volume detached for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667580 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667597 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667610 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667652 4748 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667666 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667681 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667693 4748 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667705 4748 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667719 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667737 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664165 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664333 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664384 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664455 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.664514 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.665215 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.665611 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.665745 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.665961 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666053 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666555 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666741 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.666974 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667407 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667648 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667873 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.668135 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.667528 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.668171 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.668541 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.669212 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.668664 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.668719 4748 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.668775 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.668799 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.668894 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.669172 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.669196 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.669433 4748 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.669926 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.670076 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.670545 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.670592 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.670621 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.670786 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.670911 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.671189 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.671292 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.671078 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.671412 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.671570 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:43.171546638 +0000 UTC m=+98.313092452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.671627 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.671649 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.671742 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.671215 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.671319 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.671100 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.673673 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.674076 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.674472 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.675992 4748 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676025 4748 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676060 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676218 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676241 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676255 4748 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676271 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676283 4748 
reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676294 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676305 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676343 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676358 4748 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676376 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676388 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.676979 4748 reconciler_common.go:293] "Volume detached for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677001 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677015 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677029 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677043 4748 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677058 4748 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677072 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677084 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677095 4748 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677119 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677130 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677141 4748 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677154 4748 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677166 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677178 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677190 4748 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677204 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677215 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677226 4748 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677237 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677253 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677270 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677284 4748 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677297 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677311 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677322 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677333 4748 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677347 4748 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677359 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677369 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.677379 4748 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.679918 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.680416 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.680514 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.681534 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.681561 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.681578 4748 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.681646 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:43.181625143 +0000 UTC m=+98.323170957 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.681667 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.681767 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:43.181743186 +0000 UTC m=+98.323289010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.681906 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.682206 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.682379 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.682846 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.684509 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.685481 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.685675 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.685706 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.685826 4748 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.686017 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.686110 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:43.185919647 +0000 UTC m=+98.327465461 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.686260 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.686944 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.687307 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.687462 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.688357 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.688746 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.688812 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.688907 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.689006 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.689133 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.689184 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.691876 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.691899 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.692227 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.692417 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.693439 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.694529 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.694634 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.694920 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.695048 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.695062 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.695217 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.695233 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.695267 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.695485 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.696015 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.696128 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.696235 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.696272 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.696508 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.696798 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.697684 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.697715 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.697727 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.697746 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.697759 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.698038 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.698969 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.712612 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.716458 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.718849 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.718890 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.718903 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.718922 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.718935 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.726007 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.726151 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.730965 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.734826 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.734870 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.734878 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.734894 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.734904 4748 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.745111 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.748295 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.748329 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.748339 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.748360 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.748371 4748 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.759066 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.762314 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.762350 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.762359 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.762374 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.762384 4748 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.779487 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-proxy-tls\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.779547 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.779575 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.779587 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 
10:37:42.779605 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.779617 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.779771 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-cnibin\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.779577 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-cnibin\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780114 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-cni-binary-copy\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780148 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780173 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-mcd-auth-proxy-config\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780198 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-system-cni-dir\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780222 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6ncv\" (UniqueName: \"kubernetes.io/projected/16bd6321-67e6-40c7-9ad0-5c9035765e5d-kube-api-access-z6ncv\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780250 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-socket-dir-parent\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780282 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-var-lib-kubelet\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780310 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-log-socket\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780337 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-env-overrides\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780359 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-os-release\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780383 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-var-lib-cni-multus\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780407 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szq25\" (UniqueName: \"kubernetes.io/projected/1cb89200-ecbf-4725-b48f-801aaecd6ad0-kube-api-access-szq25\") pod \"node-resolver-w68d2\" (UID: 
\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\") " pod="openshift-dns/node-resolver-w68d2" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780460 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780482 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-var-lib-kubelet\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780489 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-etc-openvswitch\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780530 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-etc-openvswitch\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780586 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-run-netns\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc 
kubenswrapper[4748]: I0320 10:37:42.780585 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-log-socket\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780612 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780636 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-systemd\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780659 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-ovn-kubernetes\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780682 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-script-lib\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780703 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndt8m\" (UniqueName: \"kubernetes.io/projected/f31addae-43ae-459d-bf9d-b5c0ee58faba-kube-api-access-ndt8m\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780726 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780754 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-os-release\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780787 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9l9h\" (UniqueName: \"kubernetes.io/projected/bb932f5b-ebd7-48e2-ba20-3d1633290c8e-kube-api-access-k9l9h\") pod \"node-ca-ftkzt\" (UID: \"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\") " pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780815 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-slash\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780864 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fksm\" (UniqueName: \"kubernetes.io/projected/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-kube-api-access-8fksm\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780888 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-systemd-units\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780983 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-ovn-kubernetes\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.780989 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-var-lib-openvswitch\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781019 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovn-node-metrics-cert\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781040 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-system-cni-dir\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781041 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-daemon-config\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781080 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-system-cni-dir\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781096 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-etc-kubernetes\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781111 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-cnibin\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781127 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/16bd6321-67e6-40c7-9ad0-5c9035765e5d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781143 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb932f5b-ebd7-48e2-ba20-3d1633290c8e-host\") pod \"node-ca-ftkzt\" (UID: \"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\") " pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781159 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb932f5b-ebd7-48e2-ba20-3d1633290c8e-serviceca\") pod \"node-ca-ftkzt\" (UID: \"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\") " pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781185 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-openvswitch\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781201 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-config\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781209 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-mcd-auth-proxy-config\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781233 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-var-lib-cni-bin\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781217 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-var-lib-cni-bin\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781250 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-os-release\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781275 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-kubelet\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781299 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-var-lib-cni-multus\") pod 
\"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781308 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-cni-dir\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781339 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srwmv\" (UniqueName: \"kubernetes.io/projected/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-kube-api-access-srwmv\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781369 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-ovn\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781397 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-node-log\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781428 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-hostroot\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 
10:37:42.781414 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-socket-dir-parent\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781463 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16bd6321-67e6-40c7-9ad0-5c9035765e5d-cni-binary-copy\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781487 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-run-netns\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781513 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-conf-dir\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781544 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-netns\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781573 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-netd\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781602 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-rootfs\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781633 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-run-multus-certs\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781641 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-env-overrides\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781665 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1cb89200-ecbf-4725-b48f-801aaecd6ad0-hosts-file\") pod \"node-resolver-w68d2\" (UID: \"1cb89200-ecbf-4725-b48f-801aaecd6ad0\") " pod="openshift-dns/node-resolver-w68d2" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781695 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-bin\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781726 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781727 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-run-k8s-cni-cncf-io\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781766 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-run-k8s-cni-cncf-io\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.781779 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-daemon-config\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.782199 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.788949 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-rootfs\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.788967 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-node-log\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789015 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-host-run-multus-certs\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789434 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-cni-binary-copy\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789468 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789485 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-kubelet\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789515 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-cni-dir\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789679 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-script-lib\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789713 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-hostroot\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789714 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-netns\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789724 4748 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-netd\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789736 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-ovn\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789760 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-systemd\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789760 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1cb89200-ecbf-4725-b48f-801aaecd6ad0-hosts-file\") pod \"node-resolver-w68d2\" (UID: \"1cb89200-ecbf-4725-b48f-801aaecd6ad0\") " pod="openshift-dns/node-resolver-w68d2" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789808 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-slash\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789930 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-bin\") pod 
\"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.790485 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16bd6321-67e6-40c7-9ad0-5c9035765e5d-cni-binary-copy\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.790519 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.790565 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-os-release\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.790587 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-multus-conf-dir\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.790607 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-systemd-units\") pod \"ovnkube-node-xdzb8\" (UID: 
\"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.789890 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-system-cni-dir\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.790981 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-etc-kubernetes\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.791008 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16bd6321-67e6-40c7-9ad0-5c9035765e5d-cnibin\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.791028 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-openvswitch\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.791313 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/16bd6321-67e6-40c7-9ad0-5c9035765e5d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 
crc kubenswrapper[4748]: I0320 10:37:42.791552 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb932f5b-ebd7-48e2-ba20-3d1633290c8e-host\") pod \"node-ca-ftkzt\" (UID: \"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\") " pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792009 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb932f5b-ebd7-48e2-ba20-3d1633290c8e-serviceca\") pod \"node-ca-ftkzt\" (UID: \"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\") " pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792182 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-config\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792388 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792411 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792426 4748 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792439 4748 reconciler_common.go:293] 
"Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792452 4748 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792464 4748 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792476 4748 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792488 4748 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792500 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792511 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792523 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792536 4748 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792551 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792564 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792575 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792586 4748 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792598 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792612 4748 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 
10:37:42.792624 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792637 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792648 4748 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792660 4748 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792673 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792685 4748 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792696 4748 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792702 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-proxy-tls\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792694 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-var-lib-openvswitch\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792712 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792827 4748 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792871 4748 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792882 4748 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792893 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc 
kubenswrapper[4748]: I0320 10:37:42.792904 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792939 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792949 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792960 4748 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792970 4748 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792979 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.792989 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793019 4748 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793030 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793042 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793052 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793062 4748 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793072 4748 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793104 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793114 4748 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793123 4748 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793133 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793143 4748 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793151 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793180 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793190 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793200 4748 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793209 4748 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793218 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793228 4748 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793257 4748 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793268 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793277 4748 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793285 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793295 4748 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793303 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793313 4748 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793341 4748 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793351 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793360 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793368 4748 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793377 4748 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793386 4748 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793394 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793424 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793435 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793443 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793451 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793460 4748 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793472 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793499 4748 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793509 4748 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793518 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793529 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793537 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793545 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 
20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793553 4748 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793580 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793590 4748 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.793600 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.802114 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6ncv\" (UniqueName: \"kubernetes.io/projected/16bd6321-67e6-40c7-9ad0-5c9035765e5d-kube-api-access-z6ncv\") pod \"multus-additional-cni-plugins-qnjmr\" (UID: \"16bd6321-67e6-40c7-9ad0-5c9035765e5d\") " pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.805395 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovn-node-metrics-cert\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.808532 4748 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-szq25\" (UniqueName: \"kubernetes.io/projected/1cb89200-ecbf-4725-b48f-801aaecd6ad0-kube-api-access-szq25\") pod \"node-resolver-w68d2\" (UID: \"1cb89200-ecbf-4725-b48f-801aaecd6ad0\") " pod="openshift-dns/node-resolver-w68d2" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.808763 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",
\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.808879 4748 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.810702 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fksm\" (UniqueName: \"kubernetes.io/projected/8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c-kube-api-access-8fksm\") pod \"machine-config-daemon-5lbvz\" (UID: \"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\") " pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.811998 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.812035 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.812051 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.812075 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.812092 
4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.812102 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndt8m\" (UniqueName: \"kubernetes.io/projected/f31addae-43ae-459d-bf9d-b5c0ee58faba-kube-api-access-ndt8m\") pod \"ovnkube-node-xdzb8\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.814252 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9l9h\" (UniqueName: \"kubernetes.io/projected/bb932f5b-ebd7-48e2-ba20-3d1633290c8e-kube-api-access-k9l9h\") pod \"node-ca-ftkzt\" (UID: \"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\") " pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.814944 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srwmv\" (UniqueName: \"kubernetes.io/projected/4275e40d-41ca-4fe4-a44b-fe86f4d2e78b-kube-api-access-srwmv\") pod \"multus-z5ksw\" (UID: \"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\") " pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.821636 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.836864 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:42 crc kubenswrapper[4748]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 10:37:42 crc kubenswrapper[4748]: set -o allexport Mar 20 10:37:42 crc kubenswrapper[4748]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 10:37:42 crc kubenswrapper[4748]: source /etc/kubernetes/apiserver-url.env Mar 20 10:37:42 crc kubenswrapper[4748]: else Mar 20 10:37:42 crc kubenswrapper[4748]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 10:37:42 crc kubenswrapper[4748]: exit 1 Mar 20 10:37:42 crc kubenswrapper[4748]: fi Mar 20 10:37:42 crc kubenswrapper[4748]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 10:37:42 crc kubenswrapper[4748]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:42 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.838337 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.842976 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.854432 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:37:42 crc kubenswrapper[4748]: W0320 10:37:42.858218 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-be3a2f78dc3c50241ce3b3d6a47b91f666f881a0f84dbe2b317baf185bd53942 WatchSource:0}: Error finding container be3a2f78dc3c50241ce3b3d6a47b91f666f881a0f84dbe2b317baf185bd53942: Status 404 returned error can't find the container with id be3a2f78dc3c50241ce3b3d6a47b91f666f881a0f84dbe2b317baf185bd53942 Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.860646 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.862074 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.863305 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-w68d2" Mar 20 10:37:42 crc kubenswrapper[4748]: W0320 10:37:42.869716 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-826f66d17788630f86a227e2b394e381a366d7aad930af11ec243fdfbe7fda7b WatchSource:0}: Error finding container 826f66d17788630f86a227e2b394e381a366d7aad930af11ec243fdfbe7fda7b: Status 404 returned error can't find the container with id 826f66d17788630f86a227e2b394e381a366d7aad930af11ec243fdfbe7fda7b Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.872492 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:42 crc kubenswrapper[4748]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 10:37:42 crc kubenswrapper[4748]: if [[ -f "/env/_master" ]]; then Mar 20 10:37:42 crc kubenswrapper[4748]: set -o allexport Mar 20 10:37:42 crc kubenswrapper[4748]: source "/env/_master" Mar 20 10:37:42 crc kubenswrapper[4748]: set +o allexport Mar 20 10:37:42 crc kubenswrapper[4748]: fi Mar 20 10:37:42 crc kubenswrapper[4748]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 10:37:42 crc kubenswrapper[4748]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 10:37:42 crc kubenswrapper[4748]: ho_enable="--enable-hybrid-overlay" Mar 20 10:37:42 crc kubenswrapper[4748]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 10:37:42 crc kubenswrapper[4748]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 10:37:42 crc kubenswrapper[4748]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 10:37:42 crc kubenswrapper[4748]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 10:37:42 crc kubenswrapper[4748]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 10:37:42 crc kubenswrapper[4748]: --webhook-host=127.0.0.1 \ Mar 20 10:37:42 crc kubenswrapper[4748]: --webhook-port=9743 \ Mar 20 10:37:42 crc kubenswrapper[4748]: ${ho_enable} \ Mar 20 10:37:42 crc kubenswrapper[4748]: --enable-interconnect \ Mar 20 10:37:42 crc kubenswrapper[4748]: --disable-approver \ Mar 20 10:37:42 crc kubenswrapper[4748]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 10:37:42 crc kubenswrapper[4748]: --wait-for-kubernetes-api=200s \ Mar 20 10:37:42 crc kubenswrapper[4748]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 10:37:42 crc kubenswrapper[4748]: --loglevel="${LOGLEVEL}" Mar 20 10:37:42 crc kubenswrapper[4748]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:42 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.873987 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ftkzt" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.879608 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:42 crc kubenswrapper[4748]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 10:37:42 crc kubenswrapper[4748]: if [[ -f "/env/_master" ]]; then Mar 20 10:37:42 crc kubenswrapper[4748]: set -o allexport Mar 20 10:37:42 crc kubenswrapper[4748]: source "/env/_master" Mar 20 10:37:42 crc kubenswrapper[4748]: set +o allexport Mar 20 10:37:42 crc kubenswrapper[4748]: fi Mar 20 10:37:42 crc kubenswrapper[4748]: Mar 20 10:37:42 crc kubenswrapper[4748]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 10:37:42 crc kubenswrapper[4748]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 10:37:42 crc kubenswrapper[4748]: --disable-webhook \ Mar 20 10:37:42 crc kubenswrapper[4748]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 10:37:42 crc kubenswrapper[4748]: --loglevel="${LOGLEVEL}" Mar 20 10:37:42 crc kubenswrapper[4748]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:42 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.880773 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.885334 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:42 crc kubenswrapper[4748]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 10:37:42 crc kubenswrapper[4748]: set -uo pipefail Mar 20 10:37:42 crc kubenswrapper[4748]: Mar 20 10:37:42 crc kubenswrapper[4748]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 10:37:42 crc kubenswrapper[4748]: Mar 20 10:37:42 crc kubenswrapper[4748]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 10:37:42 crc kubenswrapper[4748]: HOSTS_FILE="/etc/hosts" Mar 20 10:37:42 crc kubenswrapper[4748]: TEMP_FILE="/etc/hosts.tmp" Mar 20 10:37:42 crc kubenswrapper[4748]: Mar 20 10:37:42 crc kubenswrapper[4748]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 10:37:42 crc kubenswrapper[4748]: Mar 20 10:37:42 crc kubenswrapper[4748]: # Make a temporary file with the old hosts file's attributes. Mar 20 10:37:42 crc kubenswrapper[4748]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 10:37:42 crc kubenswrapper[4748]: echo "Failed to preserve hosts file. Exiting." Mar 20 10:37:42 crc kubenswrapper[4748]: exit 1 Mar 20 10:37:42 crc kubenswrapper[4748]: fi Mar 20 10:37:42 crc kubenswrapper[4748]: Mar 20 10:37:42 crc kubenswrapper[4748]: while true; do Mar 20 10:37:42 crc kubenswrapper[4748]: declare -A svc_ips Mar 20 10:37:42 crc kubenswrapper[4748]: for svc in "${services[@]}"; do Mar 20 10:37:42 crc kubenswrapper[4748]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 10:37:42 crc kubenswrapper[4748]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 20 10:37:42 crc kubenswrapper[4748]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 10:37:42 crc kubenswrapper[4748]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 10:37:42 crc kubenswrapper[4748]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 10:37:42 crc kubenswrapper[4748]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 10:37:42 crc kubenswrapper[4748]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 10:37:42 crc kubenswrapper[4748]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 10:37:42 crc kubenswrapper[4748]: for i in ${!cmds[*]} Mar 20 10:37:42 crc kubenswrapper[4748]: do Mar 20 10:37:42 crc kubenswrapper[4748]: ips=($(eval "${cmds[i]}")) Mar 20 10:37:42 crc kubenswrapper[4748]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 10:37:42 crc kubenswrapper[4748]: svc_ips["${svc}"]="${ips[@]}" Mar 20 10:37:42 crc kubenswrapper[4748]: break Mar 20 10:37:42 crc kubenswrapper[4748]: fi Mar 20 10:37:42 crc kubenswrapper[4748]: done Mar 20 10:37:42 crc kubenswrapper[4748]: done Mar 20 10:37:42 crc kubenswrapper[4748]: Mar 20 10:37:42 crc kubenswrapper[4748]: # Update /etc/hosts only if we get valid service IPs Mar 20 10:37:42 crc kubenswrapper[4748]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 10:37:42 crc kubenswrapper[4748]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 10:37:42 crc kubenswrapper[4748]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 10:37:42 crc kubenswrapper[4748]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 10:37:42 crc kubenswrapper[4748]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 10:37:42 crc kubenswrapper[4748]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 10:37:42 crc kubenswrapper[4748]: sleep 60 & wait Mar 20 10:37:42 crc kubenswrapper[4748]: continue Mar 20 10:37:42 crc kubenswrapper[4748]: fi Mar 20 10:37:42 crc kubenswrapper[4748]: Mar 20 10:37:42 crc kubenswrapper[4748]: # Append resolver entries for services Mar 20 10:37:42 crc kubenswrapper[4748]: rc=0 Mar 20 10:37:42 crc kubenswrapper[4748]: for svc in "${!svc_ips[@]}"; do Mar 20 10:37:42 crc kubenswrapper[4748]: for ip in ${svc_ips[${svc}]}; do Mar 20 10:37:42 crc kubenswrapper[4748]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 20 10:37:42 crc kubenswrapper[4748]: done Mar 20 10:37:42 crc kubenswrapper[4748]: done Mar 20 10:37:42 crc kubenswrapper[4748]: if [[ $rc -ne 0 ]]; then Mar 20 10:37:42 crc kubenswrapper[4748]: sleep 60 & wait Mar 20 10:37:42 crc kubenswrapper[4748]: continue Mar 20 10:37:42 crc kubenswrapper[4748]: fi Mar 20 10:37:42 crc kubenswrapper[4748]: Mar 20 10:37:42 crc kubenswrapper[4748]: Mar 20 10:37:42 crc kubenswrapper[4748]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 10:37:42 crc kubenswrapper[4748]: # Replace /etc/hosts with our modified version if needed Mar 20 10:37:42 crc kubenswrapper[4748]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 10:37:42 crc kubenswrapper[4748]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 10:37:42 crc kubenswrapper[4748]: fi Mar 20 10:37:42 crc kubenswrapper[4748]: sleep 60 & wait Mar 20 10:37:42 crc kubenswrapper[4748]: unset svc_ips Mar 20 10:37:42 crc kubenswrapper[4748]: done Mar 20 10:37:42 crc kubenswrapper[4748]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-szq25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-w68d2_openshift-dns(1cb89200-ecbf-4725-b48f-801aaecd6ad0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:42 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.887942 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-w68d2" 
podUID="1cb89200-ecbf-4725-b48f-801aaecd6ad0" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.892254 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" Mar 20 10:37:42 crc kubenswrapper[4748]: W0320 10:37:42.892664 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb932f5b_ebd7_48e2_ba20_3d1633290c8e.slice/crio-da7408d90b83a52992004aa8f2de4d1bb9c05eb9816b8cc634d91aad22d03afc WatchSource:0}: Error finding container da7408d90b83a52992004aa8f2de4d1bb9c05eb9816b8cc634d91aad22d03afc: Status 404 returned error can't find the container with id da7408d90b83a52992004aa8f2de4d1bb9c05eb9816b8cc634d91aad22d03afc Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.898243 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:42 crc kubenswrapper[4748]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 10:37:42 crc kubenswrapper[4748]: while [ true ]; Mar 20 10:37:42 crc kubenswrapper[4748]: do Mar 20 10:37:42 crc kubenswrapper[4748]: for f in $(ls /tmp/serviceca); do Mar 20 10:37:42 crc kubenswrapper[4748]: echo $f Mar 20 10:37:42 crc kubenswrapper[4748]: ca_file_path="/tmp/serviceca/${f}" Mar 20 10:37:42 crc kubenswrapper[4748]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 10:37:42 crc kubenswrapper[4748]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 10:37:42 crc kubenswrapper[4748]: if [ -e "${reg_dir_path}" ]; then Mar 20 10:37:42 crc kubenswrapper[4748]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 10:37:42 crc kubenswrapper[4748]: else Mar 20 10:37:42 crc kubenswrapper[4748]: mkdir $reg_dir_path Mar 20 10:37:42 crc kubenswrapper[4748]: cp $ca_file_path 
$reg_dir_path/ca.crt Mar 20 10:37:42 crc kubenswrapper[4748]: fi Mar 20 10:37:42 crc kubenswrapper[4748]: done Mar 20 10:37:42 crc kubenswrapper[4748]: for d in $(ls /etc/docker/certs.d); do Mar 20 10:37:42 crc kubenswrapper[4748]: echo $d Mar 20 10:37:42 crc kubenswrapper[4748]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 10:37:42 crc kubenswrapper[4748]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 10:37:42 crc kubenswrapper[4748]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 10:37:42 crc kubenswrapper[4748]: rm -rf /etc/docker/certs.d/$d Mar 20 10:37:42 crc kubenswrapper[4748]: fi Mar 20 10:37:42 crc kubenswrapper[4748]: done Mar 20 10:37:42 crc kubenswrapper[4748]: sleep 60 & wait ${!} Mar 20 10:37:42 crc kubenswrapper[4748]: done Mar 20 10:37:42 crc kubenswrapper[4748]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9l9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,Vo
lumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-ftkzt_openshift-image-registry(bb932f5b-ebd7-48e2-ba20-3d1633290c8e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:42 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.899463 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-ftkzt" podUID="bb932f5b-ebd7-48e2-ba20-3d1633290c8e" Mar 20 10:37:42 crc kubenswrapper[4748]: W0320 10:37:42.903746 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16bd6321_67e6_40c7_9ad0_5c9035765e5d.slice/crio-d56f3cdafc7f11212ee43585dfeeedc2c0ed889ed131177fa882866e758cd0ac WatchSource:0}: Error finding container d56f3cdafc7f11212ee43585dfeeedc2c0ed889ed131177fa882866e758cd0ac: Status 404 returned error can't find the container with id d56f3cdafc7f11212ee43585dfeeedc2c0ed889ed131177fa882866e758cd0ac Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.906651 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z6ncv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-qnjmr_openshift-multus(16bd6321-67e6-40c7-9ad0-5c9035765e5d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.907647 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.907734 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" podUID="16bd6321-67e6-40c7-9ad0-5c9035765e5d" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.914485 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.914520 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.914530 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.914548 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.914560 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:42Z","lastTransitionTime":"2026-03-20T10:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.919916 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-z5ksw" Mar 20 10:37:42 crc kubenswrapper[4748]: W0320 10:37:42.920664 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf31addae_43ae_459d_bf9d_b5c0ee58faba.slice/crio-407b8a51d2008d9ca2a94f6c8b8c647c80a8508cff7c3681726790013119adb5 WatchSource:0}: Error finding container 407b8a51d2008d9ca2a94f6c8b8c647c80a8508cff7c3681726790013119adb5: Status 404 returned error can't find the container with id 407b8a51d2008d9ca2a94f6c8b8c647c80a8508cff7c3681726790013119adb5 Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.926028 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:42 crc kubenswrapper[4748]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 10:37:42 crc kubenswrapper[4748]: apiVersion: v1 Mar 20 10:37:42 crc kubenswrapper[4748]: clusters: Mar 20 10:37:42 crc kubenswrapper[4748]: - cluster: Mar 20 10:37:42 crc kubenswrapper[4748]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 10:37:42 crc kubenswrapper[4748]: server: https://api-int.crc.testing:6443 Mar 20 10:37:42 crc kubenswrapper[4748]: name: default-cluster Mar 20 10:37:42 crc kubenswrapper[4748]: contexts: Mar 20 10:37:42 crc kubenswrapper[4748]: - context: Mar 20 10:37:42 crc kubenswrapper[4748]: cluster: default-cluster Mar 20 10:37:42 crc kubenswrapper[4748]: namespace: default Mar 20 10:37:42 crc kubenswrapper[4748]: user: default-auth Mar 20 10:37:42 crc kubenswrapper[4748]: name: default-context Mar 20 10:37:42 crc kubenswrapper[4748]: current-context: default-context Mar 20 10:37:42 crc kubenswrapper[4748]: kind: Config Mar 20 10:37:42 crc kubenswrapper[4748]: preferences: {} Mar 20 10:37:42 crc kubenswrapper[4748]: users: Mar 20 
10:37:42 crc kubenswrapper[4748]: - name: default-auth Mar 20 10:37:42 crc kubenswrapper[4748]: user: Mar 20 10:37:42 crc kubenswrapper[4748]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 10:37:42 crc kubenswrapper[4748]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 10:37:42 crc kubenswrapper[4748]: EOF Mar 20 10:37:42 crc kubenswrapper[4748]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndt8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-xdzb8_openshift-ovn-kubernetes(f31addae-43ae-459d-bf9d-b5c0ee58faba): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:42 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.927261 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" Mar 20 10:37:42 crc kubenswrapper[4748]: I0320 10:37:42.927413 4748 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:37:42 crc kubenswrapper[4748]: W0320 10:37:42.934339 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4275e40d_41ca_4fe4_a44b_fe86f4d2e78b.slice/crio-cd5bc6940c303b743ec2e86513a47a0f1824089e7eee990d943396eabb96b65c WatchSource:0}: Error finding container cd5bc6940c303b743ec2e86513a47a0f1824089e7eee990d943396eabb96b65c: Status 404 returned error can't find the container with id cd5bc6940c303b743ec2e86513a47a0f1824089e7eee990d943396eabb96b65c Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.938763 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:42 crc kubenswrapper[4748]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 10:37:42 crc kubenswrapper[4748]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 10:37:42 crc kubenswrapper[4748]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srwmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-z5ksw_openshift-multus(4275e40d-41ca-4fe4-a44b-fe86f4d2e78b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:42 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.940080 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-z5ksw" podUID="4275e40d-41ca-4fe4-a44b-fe86f4d2e78b" Mar 20 10:37:42 crc kubenswrapper[4748]: W0320 10:37:42.947374 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e81ab84_9a9e_4ec4_ae87_ec51a8bc658c.slice/crio-1a924177f53942043db0825f2f8f9d7ecdc8fe427c01ad10209a4dd590861293 WatchSource:0}: Error finding container 1a924177f53942043db0825f2f8f9d7ecdc8fe427c01ad10209a4dd590861293: Status 404 returned error can't find the container with id 1a924177f53942043db0825f2f8f9d7ecdc8fe427c01ad10209a4dd590861293 Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.951164 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fksm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.955079 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fksm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 10:37:42 crc kubenswrapper[4748]: E0320 10:37:42.956451 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.016910 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.016961 4748 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.016973 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.016994 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.017008 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:43Z","lastTransitionTime":"2026-03-20T10:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.030352 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"1a924177f53942043db0825f2f8f9d7ecdc8fe427c01ad10209a4dd590861293"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.032743 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"be3a2f78dc3c50241ce3b3d6a47b91f666f881a0f84dbe2b317baf185bd53942"} Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.033554 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fksm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.034868 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" event={"ID":"16bd6321-67e6-40c7-9ad0-5c9035765e5d","Type":"ContainerStarted","Data":"d56f3cdafc7f11212ee43585dfeeedc2c0ed889ed131177fa882866e758cd0ac"} Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.036587 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.036603 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fksm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.037555 4748 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z6ncv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-qnjmr_openshift-multus(16bd6321-67e6-40c7-9ad0-5c9035765e5d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.037763 4748 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.037769 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.039586 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0f3110df2cf44d663ab8da838816b837c4e0c3abb10d40c528cc37358cc5fe52"} Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.040724 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" podUID="16bd6321-67e6-40c7-9ad0-5c9035765e5d" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.042206 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:43 crc kubenswrapper[4748]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash 
Mar 20 10:37:43 crc kubenswrapper[4748]: set -o allexport Mar 20 10:37:43 crc kubenswrapper[4748]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 10:37:43 crc kubenswrapper[4748]: source /etc/kubernetes/apiserver-url.env Mar 20 10:37:43 crc kubenswrapper[4748]: else Mar 20 10:37:43 crc kubenswrapper[4748]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 10:37:43 crc kubenswrapper[4748]: exit 1 Mar 20 10:37:43 crc kubenswrapper[4748]: fi Mar 20 10:37:43 crc kubenswrapper[4748]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 10:37:43 crc kubenswrapper[4748]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8c
cf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value
:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:43 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 
10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.043205 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w68d2" event={"ID":"1cb89200-ecbf-4725-b48f-801aaecd6ad0","Type":"ContainerStarted","Data":"6467af8dc41a768b89878eb3de7ee80568ef9a3c4f8894593cc24c46601fcf7f"} Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.043944 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.046434 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.046516 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:43 crc kubenswrapper[4748]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 10:37:43 crc kubenswrapper[4748]: set -uo pipefail Mar 20 10:37:43 crc kubenswrapper[4748]: Mar 20 10:37:43 crc kubenswrapper[4748]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 10:37:43 crc kubenswrapper[4748]: Mar 20 10:37:43 crc kubenswrapper[4748]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 10:37:43 crc kubenswrapper[4748]: HOSTS_FILE="/etc/hosts" Mar 20 10:37:43 crc kubenswrapper[4748]: TEMP_FILE="/etc/hosts.tmp" Mar 20 10:37:43 crc kubenswrapper[4748]: Mar 20 10:37:43 crc kubenswrapper[4748]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 10:37:43 crc kubenswrapper[4748]: Mar 20 10:37:43 crc kubenswrapper[4748]: # Make a temporary file with the old hosts file's attributes. Mar 20 10:37:43 crc kubenswrapper[4748]: if ! 
cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 10:37:43 crc kubenswrapper[4748]: echo "Failed to preserve hosts file. Exiting." Mar 20 10:37:43 crc kubenswrapper[4748]: exit 1 Mar 20 10:37:43 crc kubenswrapper[4748]: fi Mar 20 10:37:43 crc kubenswrapper[4748]: Mar 20 10:37:43 crc kubenswrapper[4748]: while true; do Mar 20 10:37:43 crc kubenswrapper[4748]: declare -A svc_ips Mar 20 10:37:43 crc kubenswrapper[4748]: for svc in "${services[@]}"; do Mar 20 10:37:43 crc kubenswrapper[4748]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 10:37:43 crc kubenswrapper[4748]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 10:37:43 crc kubenswrapper[4748]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 10:37:43 crc kubenswrapper[4748]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 10:37:43 crc kubenswrapper[4748]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 10:37:43 crc kubenswrapper[4748]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 10:37:43 crc kubenswrapper[4748]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 10:37:43 crc kubenswrapper[4748]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 10:37:43 crc kubenswrapper[4748]: for i in ${!cmds[*]} Mar 20 10:37:43 crc kubenswrapper[4748]: do Mar 20 10:37:43 crc kubenswrapper[4748]: ips=($(eval "${cmds[i]}")) Mar 20 10:37:43 crc kubenswrapper[4748]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 10:37:43 crc kubenswrapper[4748]: svc_ips["${svc}"]="${ips[@]}" Mar 20 10:37:43 crc kubenswrapper[4748]: break Mar 20 10:37:43 crc kubenswrapper[4748]: fi Mar 20 10:37:43 crc kubenswrapper[4748]: done Mar 20 10:37:43 crc kubenswrapper[4748]: done Mar 20 10:37:43 crc kubenswrapper[4748]: Mar 20 10:37:43 crc kubenswrapper[4748]: # Update /etc/hosts only if we get valid service IPs Mar 20 10:37:43 crc kubenswrapper[4748]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 10:37:43 crc kubenswrapper[4748]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 10:37:43 crc kubenswrapper[4748]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 10:37:43 crc kubenswrapper[4748]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 10:37:43 crc kubenswrapper[4748]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 10:37:43 crc kubenswrapper[4748]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 10:37:43 crc kubenswrapper[4748]: sleep 60 & wait Mar 20 10:37:43 crc kubenswrapper[4748]: continue Mar 20 10:37:43 crc kubenswrapper[4748]: fi Mar 20 10:37:43 crc kubenswrapper[4748]: Mar 20 10:37:43 crc kubenswrapper[4748]: # Append resolver entries for services Mar 20 10:37:43 crc kubenswrapper[4748]: rc=0 Mar 20 10:37:43 crc kubenswrapper[4748]: for svc in "${!svc_ips[@]}"; do Mar 20 10:37:43 crc kubenswrapper[4748]: for ip in ${svc_ips[${svc}]}; do Mar 20 10:37:43 crc kubenswrapper[4748]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 10:37:43 crc kubenswrapper[4748]: done Mar 20 10:37:43 crc kubenswrapper[4748]: done Mar 20 10:37:43 crc kubenswrapper[4748]: if [[ $rc -ne 0 ]]; then Mar 20 10:37:43 crc kubenswrapper[4748]: sleep 60 & wait Mar 20 10:37:43 crc kubenswrapper[4748]: continue Mar 20 10:37:43 crc kubenswrapper[4748]: fi Mar 20 10:37:43 crc kubenswrapper[4748]: Mar 20 10:37:43 crc kubenswrapper[4748]: Mar 20 10:37:43 crc kubenswrapper[4748]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 10:37:43 crc kubenswrapper[4748]: # Replace /etc/hosts with our modified version if needed Mar 20 10:37:43 crc kubenswrapper[4748]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 10:37:43 crc kubenswrapper[4748]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 10:37:43 crc kubenswrapper[4748]: fi Mar 20 10:37:43 crc kubenswrapper[4748]: sleep 60 & wait Mar 20 10:37:43 crc kubenswrapper[4748]: unset svc_ips Mar 20 10:37:43 crc kubenswrapper[4748]: done Mar 20 10:37:43 crc kubenswrapper[4748]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-szq25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-w68d2_openshift-dns(1cb89200-ecbf-4725-b48f-801aaecd6ad0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:43 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.047664 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-w68d2" podUID="1cb89200-ecbf-4725-b48f-801aaecd6ad0" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.048148 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerStarted","Data":"407b8a51d2008d9ca2a94f6c8b8c647c80a8508cff7c3681726790013119adb5"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.051126 
4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5ksw" event={"ID":"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b","Type":"ContainerStarted","Data":"cd5bc6940c303b743ec2e86513a47a0f1824089e7eee990d943396eabb96b65c"} Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.054176 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:43 crc kubenswrapper[4748]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 10:37:43 crc kubenswrapper[4748]: apiVersion: v1 Mar 20 10:37:43 crc kubenswrapper[4748]: clusters: Mar 20 10:37:43 crc kubenswrapper[4748]: - cluster: Mar 20 10:37:43 crc kubenswrapper[4748]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 10:37:43 crc kubenswrapper[4748]: server: https://api-int.crc.testing:6443 Mar 20 10:37:43 crc kubenswrapper[4748]: name: default-cluster Mar 20 10:37:43 crc kubenswrapper[4748]: contexts: Mar 20 10:37:43 crc kubenswrapper[4748]: - context: Mar 20 10:37:43 crc kubenswrapper[4748]: cluster: default-cluster Mar 20 10:37:43 crc kubenswrapper[4748]: namespace: default Mar 20 10:37:43 crc kubenswrapper[4748]: user: default-auth Mar 20 10:37:43 crc kubenswrapper[4748]: name: default-context Mar 20 10:37:43 crc kubenswrapper[4748]: current-context: default-context Mar 20 10:37:43 crc kubenswrapper[4748]: kind: Config Mar 20 10:37:43 crc kubenswrapper[4748]: preferences: {} Mar 20 10:37:43 crc kubenswrapper[4748]: users: Mar 20 10:37:43 crc kubenswrapper[4748]: - name: default-auth Mar 20 10:37:43 crc kubenswrapper[4748]: user: Mar 20 10:37:43 crc kubenswrapper[4748]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 10:37:43 crc kubenswrapper[4748]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 10:37:43 crc 
kubenswrapper[4748]: EOF Mar 20 10:37:43 crc kubenswrapper[4748]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndt8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-xdzb8_openshift-ovn-kubernetes(f31addae-43ae-459d-bf9d-b5c0ee58faba): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:43 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.056124 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.059685 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:43 crc kubenswrapper[4748]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 10:37:43 crc 
kubenswrapper[4748]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 10:37:43 crc kubenswrapper[4748]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srwmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-z5ksw_openshift-multus(4275e40d-41ca-4fe4-a44b-fe86f4d2e78b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:43 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.060927 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-z5ksw" podUID="4275e40d-41ca-4fe4-a44b-fe86f4d2e78b" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.066561 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ftkzt" 
event={"ID":"bb932f5b-ebd7-48e2-ba20-3d1633290c8e","Type":"ContainerStarted","Data":"da7408d90b83a52992004aa8f2de4d1bb9c05eb9816b8cc634d91aad22d03afc"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.069640 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.074008 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:43 crc kubenswrapper[4748]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 10:37:43 crc kubenswrapper[4748]: while [ true ]; Mar 20 10:37:43 crc kubenswrapper[4748]: do Mar 20 10:37:43 crc kubenswrapper[4748]: for f in $(ls /tmp/serviceca); do Mar 20 10:37:43 crc kubenswrapper[4748]: echo $f Mar 20 10:37:43 crc kubenswrapper[4748]: ca_file_path="/tmp/serviceca/${f}" Mar 20 10:37:43 crc kubenswrapper[4748]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 10:37:43 crc kubenswrapper[4748]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 10:37:43 crc kubenswrapper[4748]: if [ -e "${reg_dir_path}" ]; then Mar 20 10:37:43 crc kubenswrapper[4748]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 10:37:43 crc kubenswrapper[4748]: else Mar 20 10:37:43 crc kubenswrapper[4748]: mkdir $reg_dir_path Mar 20 10:37:43 crc kubenswrapper[4748]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 10:37:43 crc kubenswrapper[4748]: fi Mar 20 10:37:43 crc kubenswrapper[4748]: done Mar 20 10:37:43 crc kubenswrapper[4748]: for d in $(ls /etc/docker/certs.d); do Mar 20 10:37:43 crc kubenswrapper[4748]: echo $d Mar 20 10:37:43 crc kubenswrapper[4748]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 10:37:43 crc kubenswrapper[4748]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 10:37:43 crc kubenswrapper[4748]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 20 10:37:43 crc kubenswrapper[4748]: rm -rf /etc/docker/certs.d/$d Mar 20 10:37:43 crc kubenswrapper[4748]: fi Mar 20 10:37:43 crc kubenswrapper[4748]: done Mar 20 10:37:43 crc kubenswrapper[4748]: sleep 60 & wait ${!} Mar 20 10:37:43 crc kubenswrapper[4748]: done Mar 20 10:37:43 crc kubenswrapper[4748]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9l9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-ftkzt_openshift-image-registry(bb932f5b-ebd7-48e2-ba20-3d1633290c8e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:43 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.074272 4748 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"826f66d17788630f86a227e2b394e381a366d7aad930af11ec243fdfbe7fda7b"} Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.075172 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-ftkzt" podUID="bb932f5b-ebd7-48e2-ba20-3d1633290c8e" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.075418 4748 scope.go:117] "RemoveContainer" containerID="abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.075702 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.076885 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:43 crc kubenswrapper[4748]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 10:37:43 crc kubenswrapper[4748]: if [[ -f "/env/_master" ]]; then Mar 20 10:37:43 crc kubenswrapper[4748]: set -o allexport Mar 20 10:37:43 crc kubenswrapper[4748]: source "/env/_master" Mar 20 10:37:43 crc kubenswrapper[4748]: set +o allexport Mar 20 10:37:43 crc kubenswrapper[4748]: fi Mar 20 10:37:43 crc kubenswrapper[4748]: # OVN-K will try to remove 
hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 10:37:43 crc kubenswrapper[4748]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 10:37:43 crc kubenswrapper[4748]: ho_enable="--enable-hybrid-overlay" Mar 20 10:37:43 crc kubenswrapper[4748]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 10:37:43 crc kubenswrapper[4748]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 10:37:43 crc kubenswrapper[4748]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 10:37:43 crc kubenswrapper[4748]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 10:37:43 crc kubenswrapper[4748]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 10:37:43 crc kubenswrapper[4748]: --webhook-host=127.0.0.1 \ Mar 20 10:37:43 crc kubenswrapper[4748]: --webhook-port=9743 \ Mar 20 10:37:43 crc kubenswrapper[4748]: ${ho_enable} \ Mar 20 10:37:43 crc kubenswrapper[4748]: --enable-interconnect \ Mar 20 10:37:43 crc kubenswrapper[4748]: --disable-approver \ Mar 20 10:37:43 crc kubenswrapper[4748]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 10:37:43 crc kubenswrapper[4748]: --wait-for-kubernetes-api=200s \ Mar 20 10:37:43 crc kubenswrapper[4748]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 10:37:43 crc kubenswrapper[4748]: --loglevel="${LOGLEVEL}" Mar 20 10:37:43 crc kubenswrapper[4748]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:43 crc 
kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.080192 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:37:43 crc kubenswrapper[4748]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 10:37:43 crc kubenswrapper[4748]: if [[ -f "/env/_master" ]]; then Mar 20 10:37:43 crc kubenswrapper[4748]: set -o allexport Mar 20 10:37:43 crc kubenswrapper[4748]: source "/env/_master" Mar 20 10:37:43 crc kubenswrapper[4748]: set +o allexport Mar 20 10:37:43 crc kubenswrapper[4748]: fi Mar 20 10:37:43 crc kubenswrapper[4748]: Mar 20 10:37:43 crc kubenswrapper[4748]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 10:37:43 crc kubenswrapper[4748]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 10:37:43 crc kubenswrapper[4748]: --disable-webhook \ Mar 20 10:37:43 crc kubenswrapper[4748]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 10:37:43 crc kubenswrapper[4748]: --loglevel="${LOGLEVEL}" Mar 20 10:37:43 crc kubenswrapper[4748]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:37:43 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.081325 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.087592 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.102471 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.115184 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.119793 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.119857 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 
10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.119872 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.119920 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.120519 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:43Z","lastTransitionTime":"2026-03-20T10:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.130247 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.141720 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.156945 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.166983 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.177289 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.191981 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.196299 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.196438 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.196490 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.196534 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.196558 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197073 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197130 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197152 4748 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197185 4748 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197349 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197365 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197376 4748 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197153 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:37:44.197130193 +0000 UTC m=+99.338676087 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197422 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:44.19740776 +0000 UTC m=+99.338953574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197097 4748 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197435 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:44.1974306 +0000 UTC m=+99.338976414 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197530 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:44.197498222 +0000 UTC m=+99.339044076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:43 crc kubenswrapper[4748]: E0320 10:37:43.197579 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:44.197558864 +0000 UTC m=+99.339104798 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.208119 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.221820 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.222523 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.222563 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.222574 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.222592 4748 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.222605 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:43Z","lastTransitionTime":"2026-03-20T10:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.236398 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.251139 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.268470 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.292906 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.308286 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.323348 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.327705 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.327749 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 
10:37:43.327766 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.327792 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.327811 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:43Z","lastTransitionTime":"2026-03-20T10:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.339438 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.351708 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.368500 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.383132 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.401141 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.415652 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.426551 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.430662 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.430721 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.430731 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.430759 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.430773 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:43Z","lastTransitionTime":"2026-03-20T10:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.522089 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.523737 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.526245 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.527793 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.529997 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.531593 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.534045 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.534131 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.534158 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.534187 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.534207 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:43Z","lastTransitionTime":"2026-03-20T10:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.534736 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.535962 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.538120 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.539215 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.541110 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.542540 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.544439 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.545630 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.547523 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.548765 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.550042 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.551733 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.552951 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.554329 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.556210 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.557613 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.559478 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.560684 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.561317 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.562874 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.564361 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.565252 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.566226 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.567490 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.568233 4748 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.568385 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.571868 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.572572 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.573614 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.575983 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.577526 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.578326 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.579822 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.581099 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.582566 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.583581 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.585653 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.588232 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.589033 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.590660 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.591727 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.593554 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.594424 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.595143 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.596407 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.597214 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.598649 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.599669 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.636563 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.636603 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.636613 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.636632 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.636644 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:43Z","lastTransitionTime":"2026-03-20T10:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.739705 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.739750 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.739763 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.739782 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.739796 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:43Z","lastTransitionTime":"2026-03-20T10:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.843109 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.843178 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.843202 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.843233 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.843258 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:43Z","lastTransitionTime":"2026-03-20T10:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.946444 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.946489 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.946520 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.946540 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:43 crc kubenswrapper[4748]: I0320 10:37:43.946550 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:43Z","lastTransitionTime":"2026-03-20T10:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.048963 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.049002 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.049011 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.049030 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.049039 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:44Z","lastTransitionTime":"2026-03-20T10:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.151170 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.151216 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.151227 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.151245 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.151256 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:44Z","lastTransitionTime":"2026-03-20T10:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.210304 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.210627 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:37:46.210584568 +0000 UTC m=+101.352130392 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.210926 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.211036 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211137 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211178 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211193 4748 projected.go:194] Error preparing 
data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211209 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211233 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.211149 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211247 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:46.211229613 +0000 UTC m=+101.352775427 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.211353 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211252 4748 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211395 4748 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211545 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:46.21150521 +0000 UTC m=+101.353051184 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211582 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:46.211572322 +0000 UTC m=+101.353118386 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211700 4748 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.211856 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:46.211803607 +0000 UTC m=+101.353349411 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.253456 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.253500 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.253510 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.253529 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.253539 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:44Z","lastTransitionTime":"2026-03-20T10:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.357668 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.357825 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.357895 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.358604 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.358693 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:44Z","lastTransitionTime":"2026-03-20T10:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.461545 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.461595 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.461603 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.461617 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.461654 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:44Z","lastTransitionTime":"2026-03-20T10:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.514571 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.514647 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.514597 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.514786 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.514911 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:37:44 crc kubenswrapper[4748]: E0320 10:37:44.515045 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.565374 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.565449 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.565468 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.565498 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.565517 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:44Z","lastTransitionTime":"2026-03-20T10:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.668600 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.668677 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.668696 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.668721 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.668738 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:44Z","lastTransitionTime":"2026-03-20T10:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.771257 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.771325 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.771351 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.771496 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.771546 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:44Z","lastTransitionTime":"2026-03-20T10:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.874161 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.874246 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.874272 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.874304 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.874328 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:44Z","lastTransitionTime":"2026-03-20T10:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.978389 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.978426 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.978435 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.978451 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:44 crc kubenswrapper[4748]: I0320 10:37:44.978462 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:44Z","lastTransitionTime":"2026-03-20T10:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.080335 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.080408 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.080431 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.080482 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.080507 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:45Z","lastTransitionTime":"2026-03-20T10:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.183269 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.183306 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.183316 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.183332 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.183342 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:45Z","lastTransitionTime":"2026-03-20T10:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.285968 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.286021 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.286034 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.286052 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.286070 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:45Z","lastTransitionTime":"2026-03-20T10:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.388588 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.388652 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.388667 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.388693 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.388710 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:45Z","lastTransitionTime":"2026-03-20T10:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.491359 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.491458 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.491504 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.491530 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.491548 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:45Z","lastTransitionTime":"2026-03-20T10:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.529943 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.543810 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.560307 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.576508 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.587579 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.594008 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.594056 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.594067 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.594086 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.594102 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:45Z","lastTransitionTime":"2026-03-20T10:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.604106 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.616736 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.627399 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.638282 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.646424 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.660692 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.673213 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.682536 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.696195 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.696225 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.696234 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.696251 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.696263 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:45Z","lastTransitionTime":"2026-03-20T10:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.799080 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.799120 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.799129 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.799146 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.799159 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:45Z","lastTransitionTime":"2026-03-20T10:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.902117 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.902163 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.902174 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.902191 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:45 crc kubenswrapper[4748]: I0320 10:37:45.902202 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:45Z","lastTransitionTime":"2026-03-20T10:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.004556 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.004612 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.004623 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.004639 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.004649 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:46Z","lastTransitionTime":"2026-03-20T10:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.106896 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.106960 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.106977 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.107000 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.107019 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:46Z","lastTransitionTime":"2026-03-20T10:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.209879 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.209953 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.209974 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.209998 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.210017 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:46Z","lastTransitionTime":"2026-03-20T10:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.231490 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.231601 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.231638 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.231668 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.231693 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-20 10:37:50.231664814 +0000 UTC m=+105.373210638 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.231740 4748 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.231747 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.231790 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.231811 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.231824 4748 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.231795 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:50.231780097 +0000 UTC m=+105.373325911 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.231888 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:50.231876609 +0000 UTC m=+105.373422493 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.231935 4748 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.231979 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:50.231971541 +0000 UTC m=+105.373517355 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.231937 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.232008 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.232020 4748 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.232042 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:50.232037313 +0000 UTC m=+105.373583117 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.313244 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.313300 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.313310 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.313328 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.313339 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:46Z","lastTransitionTime":"2026-03-20T10:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.415553 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.415614 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.415634 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.415663 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.415687 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:46Z","lastTransitionTime":"2026-03-20T10:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.515246 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.515246 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.515512 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.515387 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.515247 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:46 crc kubenswrapper[4748]: E0320 10:37:46.515661 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.518659 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.518701 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.518712 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.518728 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.518741 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:46Z","lastTransitionTime":"2026-03-20T10:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.622074 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.622151 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.622169 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.622195 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.622218 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:46Z","lastTransitionTime":"2026-03-20T10:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.725736 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.725802 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.725823 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.725918 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.725953 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:46Z","lastTransitionTime":"2026-03-20T10:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.828927 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.828992 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.829010 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.829038 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.829059 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:46Z","lastTransitionTime":"2026-03-20T10:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.931772 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.931823 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.931849 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.931874 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:46 crc kubenswrapper[4748]: I0320 10:37:46.931897 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:46Z","lastTransitionTime":"2026-03-20T10:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.034890 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.034969 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.034995 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.035030 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.035056 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:47Z","lastTransitionTime":"2026-03-20T10:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.138079 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.138607 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.138781 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.138988 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.139161 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:47Z","lastTransitionTime":"2026-03-20T10:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.242239 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.242283 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.242292 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.242309 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.242321 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:47Z","lastTransitionTime":"2026-03-20T10:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.345635 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.345680 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.345691 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.345709 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.345720 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:47Z","lastTransitionTime":"2026-03-20T10:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.447743 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.447816 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.447877 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.447912 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.447935 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:47Z","lastTransitionTime":"2026-03-20T10:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.549937 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.549991 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.550005 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.550022 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.550035 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:47Z","lastTransitionTime":"2026-03-20T10:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.654116 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.654171 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.654192 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.654220 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.654241 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:47Z","lastTransitionTime":"2026-03-20T10:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.758876 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.758938 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.758956 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.758981 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.758998 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:47Z","lastTransitionTime":"2026-03-20T10:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.861772 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.861815 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.861825 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.861864 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.861877 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:47Z","lastTransitionTime":"2026-03-20T10:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.887173 4748 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.964709 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.964754 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.964765 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.964794 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:47 crc kubenswrapper[4748]: I0320 10:37:47.964808 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:47Z","lastTransitionTime":"2026-03-20T10:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.066735 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.066773 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.066804 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.066822 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.066854 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:48Z","lastTransitionTime":"2026-03-20T10:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.169457 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.169822 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.170061 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.170224 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.170605 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:48Z","lastTransitionTime":"2026-03-20T10:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.273184 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.273426 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.273502 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.273588 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.273664 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:48Z","lastTransitionTime":"2026-03-20T10:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.376257 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.376306 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.376318 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.376333 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.376344 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:48Z","lastTransitionTime":"2026-03-20T10:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.478936 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.478985 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.478993 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.479010 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.479019 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:48Z","lastTransitionTime":"2026-03-20T10:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.514821 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.514911 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.514933 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:48 crc kubenswrapper[4748]: E0320 10:37:48.514979 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:37:48 crc kubenswrapper[4748]: E0320 10:37:48.515073 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:37:48 crc kubenswrapper[4748]: E0320 10:37:48.515163 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.581924 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.581961 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.581979 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.582002 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.582018 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:48Z","lastTransitionTime":"2026-03-20T10:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.684967 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.685004 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.685015 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.685031 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.685041 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:48Z","lastTransitionTime":"2026-03-20T10:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.787373 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.787414 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.787424 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.787441 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.787453 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:48Z","lastTransitionTime":"2026-03-20T10:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.890074 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.890132 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.890151 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.890175 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.890193 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:48Z","lastTransitionTime":"2026-03-20T10:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.993718 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.993762 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.993773 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.993791 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:48 crc kubenswrapper[4748]: I0320 10:37:48.993805 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:48Z","lastTransitionTime":"2026-03-20T10:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.095694 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.095754 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.095779 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.095810 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.095869 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:49Z","lastTransitionTime":"2026-03-20T10:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.199021 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.199094 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.199106 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.199124 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.199138 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:49Z","lastTransitionTime":"2026-03-20T10:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.302333 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.302397 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.302411 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.302429 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.302444 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:49Z","lastTransitionTime":"2026-03-20T10:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.405102 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.405148 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.405158 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.405176 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.405190 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:49Z","lastTransitionTime":"2026-03-20T10:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.507630 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.507684 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.507700 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.507725 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.507744 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:49Z","lastTransitionTime":"2026-03-20T10:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.531556 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.596555 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6"] Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.597030 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.599307 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.599339 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.610561 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.610599 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.610627 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.610651 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.610665 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:49Z","lastTransitionTime":"2026-03-20T10:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.610736 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.620794 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.631406 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.648052 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.663005 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.667026 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7866861b-54f0-43f4-8038-2f87675ff0f7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.667156 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng765\" (UniqueName: \"kubernetes.io/projected/7866861b-54f0-43f4-8038-2f87675ff0f7-kube-api-access-ng765\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.667213 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/7866861b-54f0-43f4-8038-2f87675ff0f7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.667275 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7866861b-54f0-43f4-8038-2f87675ff0f7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.692439 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900
92272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683c
e7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.704376 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.713292 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.713357 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.713381 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.713413 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.713435 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:49Z","lastTransitionTime":"2026-03-20T10:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.719021 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.728718 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.742655 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.753210 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.767758 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.768197 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng765\" (UniqueName: \"kubernetes.io/projected/7866861b-54f0-43f4-8038-2f87675ff0f7-kube-api-access-ng765\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.768245 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7866861b-54f0-43f4-8038-2f87675ff0f7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 
20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.768279 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7866861b-54f0-43f4-8038-2f87675ff0f7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.768338 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7866861b-54f0-43f4-8038-2f87675ff0f7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.769029 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7866861b-54f0-43f4-8038-2f87675ff0f7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.769436 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7866861b-54f0-43f4-8038-2f87675ff0f7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.776520 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7866861b-54f0-43f4-8038-2f87675ff0f7-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.782872 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.790678 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng765\" (UniqueName: \"kubernetes.io/projected/7866861b-54f0-43f4-8038-2f87675ff0f7-kube-api-access-ng765\") pod \"ovnkube-control-plane-749d76644c-95zs6\" (UID: \"7866861b-54f0-43f4-8038-2f87675ff0f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.802658 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.816651 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.816659 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.816719 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.816741 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.816770 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.816794 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:49Z","lastTransitionTime":"2026-03-20T10:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.911095 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.918830 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.918902 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.918920 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.918943 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:49 crc kubenswrapper[4748]: I0320 10:37:49.918961 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:49Z","lastTransitionTime":"2026-03-20T10:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:49 crc kubenswrapper[4748]: W0320 10:37:49.929458 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7866861b_54f0_43f4_8038_2f87675ff0f7.slice/crio-c210d10f22dab2fd8a09448d050c649e8a216c23e6a5d52c897cbd803342683f WatchSource:0}: Error finding container c210d10f22dab2fd8a09448d050c649e8a216c23e6a5d52c897cbd803342683f: Status 404 returned error can't find the container with id c210d10f22dab2fd8a09448d050c649e8a216c23e6a5d52c897cbd803342683f Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.021676 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.021717 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.021729 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.021747 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.021759 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:50Z","lastTransitionTime":"2026-03-20T10:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.095192 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" event={"ID":"7866861b-54f0-43f4-8038-2f87675ff0f7","Type":"ContainerStarted","Data":"c210d10f22dab2fd8a09448d050c649e8a216c23e6a5d52c897cbd803342683f"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.124730 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.124774 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.124786 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.124805 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.124818 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:50Z","lastTransitionTime":"2026-03-20T10:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.227189 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.227241 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.227249 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.227263 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.227290 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:50Z","lastTransitionTime":"2026-03-20T10:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.273279 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.273377 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.273404 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.273429 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.273455 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273553 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273570 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273573 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273582 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:37:58.273538293 +0000 UTC m=+113.415084147 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273610 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273640 4748 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273581 4748 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273605 4748 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273683 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:58.273670106 +0000 UTC m=+113.415215910 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273639 4748 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273741 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:58.273721168 +0000 UTC m=+113.415267022 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273765 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:58.273751728 +0000 UTC m=+113.415297582 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.273830 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:58.27379874 +0000 UTC m=+113.415344604 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.330485 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.330537 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.330548 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.330568 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.330583 4748 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:50Z","lastTransitionTime":"2026-03-20T10:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.433345 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.433397 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.433417 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.433441 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.433459 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:50Z","lastTransitionTime":"2026-03-20T10:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.515365 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.515424 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.515387 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.515547 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.515648 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.515765 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.536659 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.536726 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.536749 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.536801 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.536823 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:50Z","lastTransitionTime":"2026-03-20T10:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.640278 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.640363 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.640387 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.640413 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.640431 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:50Z","lastTransitionTime":"2026-03-20T10:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.695168 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5jzd5"] Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.696210 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.696318 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.717143 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.732917 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.743958 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.744007 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.744022 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.744044 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.744060 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:50Z","lastTransitionTime":"2026-03-20T10:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.747710 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.764728 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.778592 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.779480 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf2rc\" (UniqueName: \"kubernetes.io/projected/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-kube-api-access-kf2rc\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.779686 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.796152 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.809929 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.825374 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.840009 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.847319 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.847379 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.847391 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.847413 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.847425 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:50Z","lastTransitionTime":"2026-03-20T10:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.850357 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.863123 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.876219 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.880923 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf2rc\" (UniqueName: \"kubernetes.io/projected/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-kube-api-access-kf2rc\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.880970 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.881176 
4748 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:37:50 crc kubenswrapper[4748]: E0320 10:37:50.881254 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs podName:d7a2dfc5-1dd9-4ef1-9419-39f60da74b16 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:51.381228151 +0000 UTC m=+106.522773955 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs") pod "network-metrics-daemon-5jzd5" (UID: "d7a2dfc5-1dd9-4ef1-9419-39f60da74b16") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.893460 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.902551 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf2rc\" (UniqueName: \"kubernetes.io/projected/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-kube-api-access-kf2rc\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.921741 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.939243 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.951067 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.951124 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.951149 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.951178 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.951199 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:50Z","lastTransitionTime":"2026-03-20T10:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:50 crc kubenswrapper[4748]: I0320 10:37:50.953109 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.054983 4748 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.055050 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.055066 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.055089 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.055103 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:51Z","lastTransitionTime":"2026-03-20T10:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.101325 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" event={"ID":"7866861b-54f0-43f4-8038-2f87675ff0f7","Type":"ContainerStarted","Data":"3255d6c8ddc6aa04fcbceefd8e5fad043235401608adb5b05cd7aa8a30208e0c"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.101381 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" event={"ID":"7866861b-54f0-43f4-8038-2f87675ff0f7","Type":"ContainerStarted","Data":"800ac44e3f5eb0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.117629 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.131616 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.149238 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.158824 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.158937 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.158954 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.158979 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.158996 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:51Z","lastTransitionTime":"2026-03-20T10:37:51Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.166904 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.209964 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/man
ifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138
fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646
fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.229140 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.242994 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.251514 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.262070 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.262130 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.262147 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.262167 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.262182 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:51Z","lastTransitionTime":"2026-03-20T10:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.267316 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.280907 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.292869 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.305287 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.324309 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.341958 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://800ac44e3f5eb0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3255d6c8ddc6aa04fcbceefd8e5fad043235401608adb5b05cd7aa8a30208e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 
10:37:51.355281 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.365159 4748 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.365211 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.365225 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.365243 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.365256 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:51Z","lastTransitionTime":"2026-03-20T10:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.368357 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.388818 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:51 crc kubenswrapper[4748]: E0320 10:37:51.389035 4748 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:37:51 crc kubenswrapper[4748]: E0320 10:37:51.389126 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs podName:d7a2dfc5-1dd9-4ef1-9419-39f60da74b16 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:37:52.389103166 +0000 UTC m=+107.530649030 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs") pod "network-metrics-daemon-5jzd5" (UID: "d7a2dfc5-1dd9-4ef1-9419-39f60da74b16") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.468428 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.468494 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.468521 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.468548 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.468565 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:51Z","lastTransitionTime":"2026-03-20T10:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.572435 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.572512 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.572529 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.572555 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.572571 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:51Z","lastTransitionTime":"2026-03-20T10:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.675140 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.675216 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.675240 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.675276 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.675301 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:51Z","lastTransitionTime":"2026-03-20T10:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.778433 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.778512 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.778571 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.778598 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.778618 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:51Z","lastTransitionTime":"2026-03-20T10:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.882359 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.882435 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.882457 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.882484 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.882503 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:51Z","lastTransitionTime":"2026-03-20T10:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.985924 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.985989 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.986003 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.986024 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:51 crc kubenswrapper[4748]: I0320 10:37:51.986039 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:51Z","lastTransitionTime":"2026-03-20T10:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.089855 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.089921 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.089934 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.089954 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.089968 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.192974 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.193068 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.193096 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.193124 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.193143 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.295877 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.295947 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.295972 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.296007 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.296033 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.399523 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.399738 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.399874 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.399903 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.399798 4748 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.399963 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.399990 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.400048 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs podName:d7a2dfc5-1dd9-4ef1-9419-39f60da74b16 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:54.400009728 +0000 UTC m=+109.541555572 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs") pod "network-metrics-daemon-5jzd5" (UID: "d7a2dfc5-1dd9-4ef1-9419-39f60da74b16") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.503093 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.503153 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.503171 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.503197 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.503217 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.514759 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.514806 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.514914 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.515059 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.515112 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.515230 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.515364 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.515545 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.605936 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.605996 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.606024 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.606056 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.606076 4748 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.710089 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.710189 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.710228 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.710268 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.710293 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.813647 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.813719 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.813744 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.813774 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.813794 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.867100 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.867179 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.867200 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.867238 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.867260 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.885246 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.890912 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.890959 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.891020 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.891046 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.891064 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.909114 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.914906 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.915016 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.915042 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.915075 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.915098 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.929109 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.934526 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.934570 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.934587 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.934610 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.934628 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.955882 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.961482 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.961539 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.961556 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.961582 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.961601 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.976869 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d9697da-f407-4535-b044-2e042853bd80\\\",\\\"systemUUID\\\":\\\"1909d2db-5267-4c43-8cb4-dc64b5fa3add\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:52 crc kubenswrapper[4748]: E0320 10:37:52.977401 4748 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.979711 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.979765 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.979787 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.979813 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:52 crc kubenswrapper[4748]: I0320 10:37:52.979870 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:52Z","lastTransitionTime":"2026-03-20T10:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.083454 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.083521 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.083542 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.083573 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.083597 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:53Z","lastTransitionTime":"2026-03-20T10:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.187675 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.187814 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.187865 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.187902 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.187925 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:53Z","lastTransitionTime":"2026-03-20T10:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.291785 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.291907 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.291926 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.291957 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.291979 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:53Z","lastTransitionTime":"2026-03-20T10:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.395760 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.395871 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.395896 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.395925 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.395948 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:53Z","lastTransitionTime":"2026-03-20T10:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.499474 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.499549 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.499570 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.499602 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.499627 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:53Z","lastTransitionTime":"2026-03-20T10:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.516657 4748 scope.go:117] "RemoveContainer" containerID="abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6" Mar 20 10:37:53 crc kubenswrapper[4748]: E0320 10:37:53.517289 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.603486 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.603550 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.603567 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.603593 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.603612 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:53Z","lastTransitionTime":"2026-03-20T10:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.707145 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.707198 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.707216 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.707241 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.707259 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:53Z","lastTransitionTime":"2026-03-20T10:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.810033 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.810090 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.810102 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.810122 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.810135 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:53Z","lastTransitionTime":"2026-03-20T10:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.912730 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.912788 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.912812 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.912861 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:53 crc kubenswrapper[4748]: I0320 10:37:53.912878 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:53Z","lastTransitionTime":"2026-03-20T10:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.015343 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.015388 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.015398 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.015417 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.015430 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:54Z","lastTransitionTime":"2026-03-20T10:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.112532 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w68d2" event={"ID":"1cb89200-ecbf-4725-b48f-801aaecd6ad0","Type":"ContainerStarted","Data":"cb2d05795c830ca8e3f512fe8af38eacc2202ed111af8df7425104aeaab642d6"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.118465 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.118498 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.118509 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.118524 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.118535 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:54Z","lastTransitionTime":"2026-03-20T10:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.127006 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.144067 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.158179 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.186865 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.206729 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.223303 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.223447 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.223477 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.223463 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.223515 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.223737 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:54Z","lastTransitionTime":"2026-03-20T10:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.241545 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.260718 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b97
4db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.277753 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.290982 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2d05795c830ca8e3f512fe8af38eacc2202ed111af8df7425104aeaab642d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.305871 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.324519 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.326854 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.326884 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.326895 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.326914 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.326927 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:54Z","lastTransitionTime":"2026-03-20T10:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.340310 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://800ac44e3f5eb0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3255d6c8ddc6aa04fcbceefd8e5fad043235401608adb5b05cd7aa8a30208e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.357029 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.375374 4748 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.395784 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.423222 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:54 crc kubenswrapper[4748]: E0320 10:37:54.423454 4748 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:37:54 crc 
kubenswrapper[4748]: E0320 10:37:54.423537 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs podName:d7a2dfc5-1dd9-4ef1-9419-39f60da74b16 nodeName:}" failed. No retries permitted until 2026-03-20 10:37:58.423515266 +0000 UTC m=+113.565061110 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs") pod "network-metrics-daemon-5jzd5" (UID: "d7a2dfc5-1dd9-4ef1-9419-39f60da74b16") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.435814 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.435937 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.435960 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.435985 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.436004 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:54Z","lastTransitionTime":"2026-03-20T10:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.515160 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.515202 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.515212 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:54 crc kubenswrapper[4748]: E0320 10:37:54.515345 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:37:54 crc kubenswrapper[4748]: E0320 10:37:54.516178 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.516269 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:54 crc kubenswrapper[4748]: E0320 10:37:54.516414 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:37:54 crc kubenswrapper[4748]: E0320 10:37:54.516546 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.538905 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.538968 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.538990 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.539021 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.539043 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:54Z","lastTransitionTime":"2026-03-20T10:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.642017 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.642102 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.642121 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.642149 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.642167 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:54Z","lastTransitionTime":"2026-03-20T10:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.745935 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.745988 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.745999 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.746032 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.746046 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:54Z","lastTransitionTime":"2026-03-20T10:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.848900 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.848971 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.848993 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.849017 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.849035 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:54Z","lastTransitionTime":"2026-03-20T10:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.952932 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.953016 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.953039 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.953069 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:54 crc kubenswrapper[4748]: I0320 10:37:54.953091 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:54Z","lastTransitionTime":"2026-03-20T10:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.057102 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.057193 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.057216 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.057248 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.057271 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:55Z","lastTransitionTime":"2026-03-20T10:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.160990 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.161067 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.161085 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.161115 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.161139 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:55Z","lastTransitionTime":"2026-03-20T10:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.265295 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.265368 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.265385 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.265415 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.265508 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:55Z","lastTransitionTime":"2026-03-20T10:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.368250 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.368307 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.368319 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.368338 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.368351 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:55Z","lastTransitionTime":"2026-03-20T10:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.471411 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.471476 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.471497 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.471523 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.471539 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:55Z","lastTransitionTime":"2026-03-20T10:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.529355 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://800ac44e3f5eb0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3255d6c8ddc6aa04fcbceefd8e5fad043235401608adb5b05cd7aa8a30208e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.539853 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.554487 4748 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.567705 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.573987 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.574050 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.574073 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.574144 4748 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.574171 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:55Z","lastTransitionTime":"2026-03-20T10:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.583134 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":fals
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.594097 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.609288 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.626788 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.639180 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.652782 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.668869 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.676575 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.676610 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.676621 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.676641 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.676654 4748 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:55Z","lastTransitionTime":"2026-03-20T10:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.680513 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.696506 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.710861 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.724752 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2d05795c830ca8e3f512fe8af38eacc2202ed111af8df7425104aeaab642d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.739208 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.780138 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.780224 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.780245 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.780276 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.780297 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:55Z","lastTransitionTime":"2026-03-20T10:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.884204 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.884315 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.884334 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.884362 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.884386 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:55Z","lastTransitionTime":"2026-03-20T10:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.987755 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.987816 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.987866 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.987898 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:55 crc kubenswrapper[4748]: I0320 10:37:55.987918 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:55Z","lastTransitionTime":"2026-03-20T10:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.091156 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.091248 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.091277 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.091321 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.091344 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:56Z","lastTransitionTime":"2026-03-20T10:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.195028 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.195069 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.195081 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.195102 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.195116 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:56Z","lastTransitionTime":"2026-03-20T10:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.298414 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.298452 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.298461 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.298479 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.298489 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:56Z","lastTransitionTime":"2026-03-20T10:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.401808 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.401894 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.401911 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.401933 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.401946 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:56Z","lastTransitionTime":"2026-03-20T10:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.505397 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.505482 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.505499 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.505531 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.505550 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:56Z","lastTransitionTime":"2026-03-20T10:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.514472 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:56 crc kubenswrapper[4748]: E0320 10:37:56.514697 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.515085 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.515159 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.515246 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:56 crc kubenswrapper[4748]: E0320 10:37:56.515441 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:37:56 crc kubenswrapper[4748]: E0320 10:37:56.515542 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:37:56 crc kubenswrapper[4748]: E0320 10:37:56.515644 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.609206 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.609245 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.609301 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.609327 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.609367 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:56Z","lastTransitionTime":"2026-03-20T10:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.713158 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.713210 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.713224 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.713248 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.713263 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:56Z","lastTransitionTime":"2026-03-20T10:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.816978 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.817039 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.817058 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.817086 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.817106 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:56Z","lastTransitionTime":"2026-03-20T10:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.920308 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.920408 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.920433 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.921067 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:56 crc kubenswrapper[4748]: I0320 10:37:56.921343 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:56Z","lastTransitionTime":"2026-03-20T10:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.024915 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.024983 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.025003 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.025029 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.025052 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:57Z","lastTransitionTime":"2026-03-20T10:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.125102 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f9cb1fc3eafd5de5fca7d4df8e73044f74a9ac7a576d64ba67fec5dfa110f565"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.127664 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.127721 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.127739 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.127763 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.127781 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:57Z","lastTransitionTime":"2026-03-20T10:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.128274 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5ksw" event={"ID":"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b","Type":"ContainerStarted","Data":"6d7da9e84325d6db16238d0a74d6557fb637f246413b37115035e4fd0aa8eaa9"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.147725 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.189489 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.206280 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://800ac44e3f5e
b0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3255d6c8ddc6aa04fcbceefd8e5fad043235401608adb5b05cd7aa8a30208e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\
\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.219411 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.231674 4748 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.231733 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.231750 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.231779 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.231792 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:57Z","lastTransitionTime":"2026-03-20T10:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.234403 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cb1fc3eafd5de5fca7d4df8e73044f74a9ac7a576d64ba67fec5dfa110f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.255357 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.267749 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.285882 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.301302 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.335582 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.335672 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.335691 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.335722 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.335741 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:57Z","lastTransitionTime":"2026-03-20T10:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.338587 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.358277 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.375172 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.394947 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.417743 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b97
4db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.433905 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.439511 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.439576 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.439592 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.439618 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.439637 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:57Z","lastTransitionTime":"2026-03-20T10:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.445571 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2d05795c830ca8e3f512fe8af38eacc2202ed111af8df7425104aeaab642d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335
ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.461033 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.477176 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cb1fc3eafd5de5fca7d4df8e73044f74a9ac7a576d64ba67fec5dfa110f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.501500 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.530659 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.542150 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.542210 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.542229 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.542259 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.542279 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:57Z","lastTransitionTime":"2026-03-20T10:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.545523 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://800ac44e3f5eb0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3255d6c8ddc6aa04fcbceefd8e5fad043235401608adb5b05cd7aa8a30208e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.560719 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.582390 4748 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d2403
6423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.595507 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.607762 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.622868 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.633575 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.646007 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d7da9e84325d6db16238d0a74d6557fb637f246413b37115035e4fd0aa8eaa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.646172 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.646202 4748 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.646210 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.646226 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.646236 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:57Z","lastTransitionTime":"2026-03-20T10:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.657204 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.669331 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.680486 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.692767 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2d05795c830ca8e3f512fe8af38eacc2202ed111af8df7425104aeaab642d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.748943 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.749022 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.749088 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.749122 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.749146 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:57Z","lastTransitionTime":"2026-03-20T10:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.852341 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.852409 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.852418 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.852435 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.852446 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:57Z","lastTransitionTime":"2026-03-20T10:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.955347 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.955390 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.955402 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.955421 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:57 crc kubenswrapper[4748]: I0320 10:37:57.955433 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:57Z","lastTransitionTime":"2026-03-20T10:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.058510 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.058595 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.058621 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.058652 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.058675 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:58Z","lastTransitionTime":"2026-03-20T10:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.133046 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" event={"ID":"16bd6321-67e6-40c7-9ad0-5c9035765e5d","Type":"ContainerStarted","Data":"48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.135804 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"03c4104b260930c777d385e243f9163dff5c4db4d6a523b77574c5b0ba63b705"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.138077 4748 generic.go:334] "Generic (PLEG): container finished" podID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerID="de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341" exitCode=0 Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.138133 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerDied","Data":"de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.156246 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.162085 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.162152 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.162178 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.162212 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.162243 4748 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:58Z","lastTransitionTime":"2026-03-20T10:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.172224 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.184140 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2d05795c830ca8e3f512fe8af38eacc2202ed111af8df7425104aeaab642d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.209105 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.230732 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cb1fc3eafd5de5fca7d4df8e73044f74a9ac7a576d64ba67fec5dfa110f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.253815 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.265455 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.265491 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.265505 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.265524 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.265538 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:58Z","lastTransitionTime":"2026-03-20T10:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.285035 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.299217 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://800ac44e3f5eb0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3255d6c8ddc6aa04fcbceefd8e5fad043235401608adb5b05cd7aa8a30208e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 
10:37:58.316679 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.342040 4748 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d2403
6423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.354353 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.361433 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.368206 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.368236 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 
10:37:58.368245 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.368260 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.368269 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:58Z","lastTransitionTime":"2026-03-20T10:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.370953 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.371041 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.371086 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.371157 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.371223 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371408 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371444 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371484 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371476 4748 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371540 4748 secret.go:188] Couldn't get 
secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371506 4748 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371617 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.371576348 +0000 UTC m=+129.513122242 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371663 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.371642879 +0000 UTC m=+129.513188723 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371697 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.37168312 +0000 UTC m=+129.513228964 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371723 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.371710221 +0000 UTC m=+129.513256065 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371466 4748 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371764 4748 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.371822 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.371810873 +0000 UTC m=+129.513356717 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.373752 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.383951 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.393130 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d7da9e84325d6db16238d0a74d6557fb637f246413b37115035e4fd0aa8eaa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.405932 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.416580 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.426915 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.441123 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.450552 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://800ac44e3f5eb0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12
962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3255d6c8ddc6aa04fcbceefd8e5fad043235401608adb5b05cd7aa8a30208e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.458393 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.468877 4748 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cb1fc3eafd5de5fca7d4df8e73044f74a9ac7a576d64ba67fec5dfa110f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.470926 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.470980 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.470992 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.471010 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.471021 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:58Z","lastTransitionTime":"2026-03-20T10:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.471750 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.471971 4748 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.472076 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs podName:d7a2dfc5-1dd9-4ef1-9419-39f60da74b16 nodeName:}" failed. No retries permitted until 2026-03-20 10:38:06.472054176 +0000 UTC m=+121.613600000 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs") pod "network-metrics-daemon-5jzd5" (UID: "d7a2dfc5-1dd9-4ef1-9419-39f60da74b16") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.482438 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.490756 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.506058 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d7da9e84325d6db16238d0a74d6557fb637f246413b37115035e4fd0aa8eaa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.514466 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.514547 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.514621 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.514759 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.515756 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.516178 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.516997 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:37:58 crc kubenswrapper[4748]: E0320 10:37:58.515437 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.524462 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.555714 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/man
ifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138
fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646
fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.568296 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.574425 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.574464 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.574473 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.574488 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.574499 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:58Z","lastTransitionTime":"2026-03-20T10:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.583320 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.597562 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2d05795c830ca8e3f512fe8af38eacc2202ed111af8df7425104aeaab642d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.613384 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.627567 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.680046 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.680091 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.680108 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.680133 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.680152 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:58Z","lastTransitionTime":"2026-03-20T10:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.783776 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.784141 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.784166 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.784192 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.784209 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:58Z","lastTransitionTime":"2026-03-20T10:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.886697 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.886735 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.886745 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.886762 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.886772 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:58Z","lastTransitionTime":"2026-03-20T10:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.989817 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.989885 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.989899 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.989917 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:58 crc kubenswrapper[4748]: I0320 10:37:58.989930 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:58Z","lastTransitionTime":"2026-03-20T10:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.092967 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.093016 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.093031 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.093050 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.093064 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:59Z","lastTransitionTime":"2026-03-20T10:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.147252 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerStarted","Data":"52213e8defefb79f069d768559f145776cd25ea532c71296687b5922c982adf4"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.147307 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerStarted","Data":"f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.147321 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerStarted","Data":"87af6eef1dc0be4682b0bbfe7b485e49360e19dd3b48012cc6a0e79e02b6c3f8"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.150147 4748 generic.go:334] "Generic (PLEG): container finished" podID="16bd6321-67e6-40c7-9ad0-5c9035765e5d" containerID="48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c" exitCode=0 Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.150263 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" event={"ID":"16bd6321-67e6-40c7-9ad0-5c9035765e5d","Type":"ContainerDied","Data":"48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.152542 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ftkzt" event={"ID":"bb932f5b-ebd7-48e2-ba20-3d1633290c8e","Type":"ContainerStarted","Data":"3f474d047b5f62546145a48372af7a8a5038663df53171b9ae88658ce76c2ba6"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.158236 4748 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c7227427635ed0330153fc00958615748717c92bfe2bf0f0b8e26ffbd5882d8f"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.161023 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"2460976ea11813e93eb323d016690013bd4516347015151c02ebf788f86c7358"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.170441 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.185497 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.198391 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.198447 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.198464 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.198511 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.198530 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:59Z","lastTransitionTime":"2026-03-20T10:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.202947 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2d05795c830ca8e3f512fe8af38eacc2202ed111af8df7425104aeaab642d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.219980 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.252623 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cb1fc3eafd5de5fca7d4df8e73044f74a9ac7a576d64ba67fec5dfa110f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.277940 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:37:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.299371 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.301035 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.301082 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.301107 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.301128 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.301143 4748 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:59Z","lastTransitionTime":"2026-03-20T10:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.313734 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://800ac44e3f5eb0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3255d6c8ddc6aa04fcbceefd8e5fad043235401608adb5b05cd7aa8a30208e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.324933 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.344974 4748 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d2403
6423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.354243 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.366928 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.379544 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.397678 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.404567 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.404608 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.404617 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.404632 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.404644 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:59Z","lastTransitionTime":"2026-03-20T10:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.413739 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d7da9e84325d6db16238d0a74d6557fb637f246413b37115035e4fd0aa8eaa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.422644 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.433237 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.450291 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.460581 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://800ac44e3f5eb0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3255d6c8ddc6aa04fcbceefd8e5fad0432354
01608adb5b05cd7aa8a30208e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.470524 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.480686 4748 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cb1fc3eafd5de5fca7d4df8e73044f74a9ac7a576d64ba67fec5dfa110f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.494508 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:37:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.503743 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ftkzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb932f5b-ebd7-48e2-ba20-3d1633290c8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f474d047b5f62546145a48372af7a8a5038663df53171b9ae88658ce76c2ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ftkzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.507539 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.507581 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.507591 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.507610 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.507622 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:59Z","lastTransitionTime":"2026-03-20T10:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.515074 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-z5ksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d7da9e84325d6db16238d0a74d6557fb637f246413b37115035e4fd0aa8eaa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srwmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-z5ksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.525396 
4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2460976ea11813e93eb323d016690013bd4516347015151c02ebf788f86c7358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c4104b260930c777d385e243f9163dff5c4db4d6a523b77574c5b0ba63b705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.545645 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bebc081d-35c4-4fb1-b774-1b05d4294efe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ca15e526ecd376669cb2a7c1548debd0aff2ab5e7ae466bbfca7398391a4eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5844c4d5c4b02ba26d1e9f39a80e6df89f51440d755e03de56504a9373c0955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://641de573660b1953a4b99303cb87e56a7d9b8fabadf4fbf1800bc853a5b49a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a3b0d0138fd88cff5759472f5be514bb57ee4a566614d8a2a5338e38af7260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5a3265c29eedc75658bbc6228215861d3f584633ef23c20df9e11ed8e0141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d11cbf1605caf577fd462823f54f7390c8c8ab1d24036423e46215d1598af27\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddabedf53696b05f22ba61f42f37346f819ab9d210e1ffb33bcc4c4bc685daa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9147cf13ba247b6bd925ebdd83346a8023d6f9cc0ec94b41bde683ce7a91f737\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.556326 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.565627 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.574921 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.587918 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5483745-95c8-4c6e-bd16-ec5fec57af5d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:37:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 10:37:13.169272 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:37:13.169530 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:37:13.170805 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-114493928/tls.crt::/tmp/serving-cert-114493928/tls.key\\\\\\\"\\\\nI0320 10:37:13.625648 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:37:13.628420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:37:13.628441 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:37:13.628469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:37:13.628482 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:37:13.635699 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:37:13.635718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 10:37:13.635717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:37:13.635722 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:37:13.635738 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:37:13.635741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:37:13.635744 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:37:13.635746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:37:13.637809 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:36:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b97
4db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:36:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:36:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.599167 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.609142 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w68d2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb89200-ecbf-4725-b48f-801aaecd6ad0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb2d05795c830ca8e3f512fe8af38eacc2202ed111af8df7425104aeaab642d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szq25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w68d2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.610229 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.610270 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.610279 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.610296 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.610310 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:59Z","lastTransitionTime":"2026-03-20T10:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.713478 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.713550 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.713564 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.713585 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.713595 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:59Z","lastTransitionTime":"2026-03-20T10:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.824031 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.824096 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.824106 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.824123 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.824134 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:59Z","lastTransitionTime":"2026-03-20T10:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.926329 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.927016 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.927028 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.927045 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:37:59 crc kubenswrapper[4748]: I0320 10:37:59.927058 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:37:59Z","lastTransitionTime":"2026-03-20T10:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.031096 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.031140 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.031153 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.031172 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.031187 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:00Z","lastTransitionTime":"2026-03-20T10:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.134968 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.135005 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.135015 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.135032 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.135042 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:00Z","lastTransitionTime":"2026-03-20T10:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.166405 4748 generic.go:334] "Generic (PLEG): container finished" podID="16bd6321-67e6-40c7-9ad0-5c9035765e5d" containerID="d57ede5a95d782583c98f904805810600747a07fe160c2cd7abdf54c974c2a3d" exitCode=0 Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.166487 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" event={"ID":"16bd6321-67e6-40c7-9ad0-5c9035765e5d","Type":"ContainerDied","Data":"d57ede5a95d782583c98f904805810600747a07fe160c2cd7abdf54c974c2a3d"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.169680 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2d2607270510db7b8a19cd88715ead8d94bac9191d58d9644c588ea170d16cf8"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.176104 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerStarted","Data":"a22f1239a0ca652003eb95b617e5a6b84f13d70ac2dca8f1593c2fb3cc3ff87a"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.176161 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerStarted","Data":"2bf2050b15cfd2fd678dca05462c6e8ce79672ef315106eb9fdc8fea4ab5c0a6"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.176180 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerStarted","Data":"8b874abcca50cad2b1c5a29e4d7254aea1e0233ad9280e327d03c34634402211"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.194666 4748 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:38:00Z is after 2025-08-24T17:21:41Z" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.210009 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kf2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5jzd5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:38:00Z is after 2025-08-24T17:21:41Z" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.223370 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9cb1fc3eafd5de5fca7d4df8e73044f74a9ac7a576d64ba67fec5dfa110f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:38:00Z is after 2025-08-24T17:21:41Z" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.237752 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.237786 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.237795 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.237809 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.237819 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:00Z","lastTransitionTime":"2026-03-20T10:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.239799 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16bd6321-67e6-40c7-9ad0-5c9035765e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48dc39d41e31eafb666756d3a4195885c16775b9ef82f62a3a7ad15b22995b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:37:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57ede5a95d782583c98f904805810600747a07fe160c2cd7abdf54c974c2a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d57ede5a95d782583c98f904805810600747a07fe160c2cd7abdf54c974c2a3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:38:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6ncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qnjmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:38:00Z is after 2025-08-24T17:21:41Z" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.258692 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f31addae-43ae-459d-bf9d-b5c0ee58faba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:37:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndt8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xdzb8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:38:00Z is after 2025-08-24T17:21:41Z" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.271246 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7866861b-54f0-43f4-8038-2f87675ff0f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://800ac44e3f5eb0fee149f0e67f71fc566444c67b08fb592a3b30bf780fae8bfb\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3255d6c8ddc6aa04fcbceefd8e5fad043235401608adb5b05cd7aa8a30208e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ng765\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-95zs6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:38:00Z is after 2025-08-24T17:21:41Z" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.284328 4748 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:37:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2460976ea11813e93eb323d016690013bd4516347015151c02ebf788f86c7358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d79
3426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c4104b260930c777d385e243f9163dff5c4db4d6a523b77574c5b0ba63b705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:37:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fksm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:37:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5lbvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:38:00Z is after 2025-08-24T17:21:41Z" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.315072 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=11.315053702 podStartE2EDuration="11.315053702s" podCreationTimestamp="2026-03-20 10:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:00.314646992 +0000 UTC m=+115.456192806" watchObservedRunningTime="2026-03-20 10:38:00.315053702 +0000 UTC m=+115.456599526" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.340892 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.340946 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.340960 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.340980 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.340991 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:00Z","lastTransitionTime":"2026-03-20T10:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.386693 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ftkzt" podStartSLOduration=56.38667175 podStartE2EDuration="56.38667175s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:00.372031515 +0000 UTC m=+115.513577349" watchObservedRunningTime="2026-03-20 10:38:00.38667175 +0000 UTC m=+115.528217564" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.387026 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-z5ksw" podStartSLOduration=56.387021429 podStartE2EDuration="56.387021429s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:00.386490136 +0000 UTC m=+115.528035950" watchObservedRunningTime="2026-03-20 10:38:00.387021429 +0000 UTC m=+115.528567243" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.436964 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w68d2" podStartSLOduration=56.43693581 podStartE2EDuration="56.43693581s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:00.436685214 +0000 UTC m=+115.578231038" watchObservedRunningTime="2026-03-20 10:38:00.43693581 +0000 UTC m=+115.578481634" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.443916 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.444074 4748 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.444155 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.444248 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.444355 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:00Z","lastTransitionTime":"2026-03-20T10:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.493760 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-95zs6" podStartSLOduration=55.493739009 podStartE2EDuration="55.493739009s" podCreationTimestamp="2026-03-20 10:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:00.493594065 +0000 UTC m=+115.635139909" watchObservedRunningTime="2026-03-20 10:38:00.493739009 +0000 UTC m=+115.635284823" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.515197 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.515258 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.515266 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.515269 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:38:00 crc kubenswrapper[4748]: E0320 10:38:00.515364 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:38:00 crc kubenswrapper[4748]: E0320 10:38:00.515434 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:38:00 crc kubenswrapper[4748]: E0320 10:38:00.515506 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:38:00 crc kubenswrapper[4748]: E0320 10:38:00.515562 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.546826 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.546896 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.546912 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.546935 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.546950 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:00Z","lastTransitionTime":"2026-03-20T10:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.559779 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podStartSLOduration=56.559758971 podStartE2EDuration="56.559758971s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:00.558733706 +0000 UTC m=+115.700279530" watchObservedRunningTime="2026-03-20 10:38:00.559758971 +0000 UTC m=+115.701304805" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.650403 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.650466 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.650483 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.650507 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.650523 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:00Z","lastTransitionTime":"2026-03-20T10:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.753876 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.753979 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.754001 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.754063 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.754086 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:00Z","lastTransitionTime":"2026-03-20T10:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.857005 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.857082 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.857094 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.857114 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.857148 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:00Z","lastTransitionTime":"2026-03-20T10:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.960696 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.960774 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.960807 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.960828 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:00 crc kubenswrapper[4748]: I0320 10:38:00.960974 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:00Z","lastTransitionTime":"2026-03-20T10:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.064072 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.064118 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.064132 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.064151 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.064164 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:01Z","lastTransitionTime":"2026-03-20T10:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.167142 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.167206 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.167230 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.167261 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.167286 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:01Z","lastTransitionTime":"2026-03-20T10:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.183461 4748 generic.go:334] "Generic (PLEG): container finished" podID="16bd6321-67e6-40c7-9ad0-5c9035765e5d" containerID="cba1a734b5964a5cee7ec88618bd265da53a1120b7527968a6ec9b1cd604723b" exitCode=0 Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.183550 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" event={"ID":"16bd6321-67e6-40c7-9ad0-5c9035765e5d","Type":"ContainerDied","Data":"cba1a734b5964a5cee7ec88618bd265da53a1120b7527968a6ec9b1cd604723b"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.185776 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3b1c4319c20ac30f2598999051c46686e925297bd0e212e9a042014f20204756"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.271296 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.271361 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.271384 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.271412 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.271432 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:01Z","lastTransitionTime":"2026-03-20T10:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.375516 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.375604 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.375630 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.375667 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.375692 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:01Z","lastTransitionTime":"2026-03-20T10:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.478860 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.479260 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.479278 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.479307 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.479325 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:01Z","lastTransitionTime":"2026-03-20T10:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.582250 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.582291 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.582304 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.582329 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.582342 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:01Z","lastTransitionTime":"2026-03-20T10:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.685752 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.685806 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.685824 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.685877 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.685896 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:01Z","lastTransitionTime":"2026-03-20T10:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.792174 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.792240 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.792258 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.792284 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.792307 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:01Z","lastTransitionTime":"2026-03-20T10:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.895549 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.895618 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.895642 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.895676 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.895702 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:01Z","lastTransitionTime":"2026-03-20T10:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.999414 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.999476 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.999496 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.999522 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:01 crc kubenswrapper[4748]: I0320 10:38:01.999542 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:01Z","lastTransitionTime":"2026-03-20T10:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.102027 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.102071 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.102081 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.102096 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.102110 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:02Z","lastTransitionTime":"2026-03-20T10:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.194267 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerStarted","Data":"fe1f6bd9aee16c9e4c4607895576f28294888b4c1a29f8bf4b4a23bc5b35ec68"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.197225 4748 generic.go:334] "Generic (PLEG): container finished" podID="16bd6321-67e6-40c7-9ad0-5c9035765e5d" containerID="2d7ad704aea1d5e1e37f96363270e0b03bdebf243fd03be4c491922bc7265aa8" exitCode=0 Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.197283 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" event={"ID":"16bd6321-67e6-40c7-9ad0-5c9035765e5d","Type":"ContainerDied","Data":"2d7ad704aea1d5e1e37f96363270e0b03bdebf243fd03be4c491922bc7265aa8"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.203786 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.203850 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.203861 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.203877 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.203887 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:02Z","lastTransitionTime":"2026-03-20T10:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.306282 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.306903 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.306928 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.306947 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.306960 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:02Z","lastTransitionTime":"2026-03-20T10:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.409961 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.410011 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.410024 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.410042 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.410054 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:02Z","lastTransitionTime":"2026-03-20T10:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.513920 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.513978 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.513999 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.514029 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.514051 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:02Z","lastTransitionTime":"2026-03-20T10:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.514272 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.514300 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.514330 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.514281 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:38:02 crc kubenswrapper[4748]: E0320 10:38:02.514422 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:38:02 crc kubenswrapper[4748]: E0320 10:38:02.514608 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:38:02 crc kubenswrapper[4748]: E0320 10:38:02.514765 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:38:02 crc kubenswrapper[4748]: E0320 10:38:02.514902 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.616405 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.616439 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.616447 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.616462 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.616474 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:02Z","lastTransitionTime":"2026-03-20T10:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.720224 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.720295 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.720318 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.720351 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.720374 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:02Z","lastTransitionTime":"2026-03-20T10:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.823363 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.823444 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.823463 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.823492 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.823512 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:02Z","lastTransitionTime":"2026-03-20T10:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.926498 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.926553 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.926566 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.926587 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:02 crc kubenswrapper[4748]: I0320 10:38:02.926601 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:02Z","lastTransitionTime":"2026-03-20T10:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.029970 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.030025 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.030036 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.030056 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.030069 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:03Z","lastTransitionTime":"2026-03-20T10:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.077540 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.077581 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.077592 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.077606 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.077617 4748 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:38:03Z","lastTransitionTime":"2026-03-20T10:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.133590 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh"] Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.134319 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.136438 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.136707 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.136926 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.137009 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.211072 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" event={"ID":"16bd6321-67e6-40c7-9ad0-5c9035765e5d","Type":"ContainerStarted","Data":"e2f0d8762f00a604fbb1b3a62a198a667ad36bde99278eeeacf3ce67d97e3a55"} Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.331338 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/699d3fbb-5c31-4588-a2ea-29a8168a6954-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.331430 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/699d3fbb-5c31-4588-a2ea-29a8168a6954-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: 
\"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.331451 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699d3fbb-5c31-4588-a2ea-29a8168a6954-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.331502 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/699d3fbb-5c31-4588-a2ea-29a8168a6954-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.331562 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/699d3fbb-5c31-4588-a2ea-29a8168a6954-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.432611 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/699d3fbb-5c31-4588-a2ea-29a8168a6954-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.432687 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/699d3fbb-5c31-4588-a2ea-29a8168a6954-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.432712 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699d3fbb-5c31-4588-a2ea-29a8168a6954-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.432757 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/699d3fbb-5c31-4588-a2ea-29a8168a6954-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.432780 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/699d3fbb-5c31-4588-a2ea-29a8168a6954-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.432897 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/699d3fbb-5c31-4588-a2ea-29a8168a6954-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.432899 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/699d3fbb-5c31-4588-a2ea-29a8168a6954-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.433641 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/699d3fbb-5c31-4588-a2ea-29a8168a6954-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.439279 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699d3fbb-5c31-4588-a2ea-29a8168a6954-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.461481 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/699d3fbb-5c31-4588-a2ea-29a8168a6954-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2wflh\" (UID: \"699d3fbb-5c31-4588-a2ea-29a8168a6954\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: I0320 10:38:03.747465 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" Mar 20 10:38:03 crc kubenswrapper[4748]: W0320 10:38:03.769321 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod699d3fbb_5c31_4588_a2ea_29a8168a6954.slice/crio-747653ccfeff6787228d378386d7841663c4d57baf671cec812ed9ad49c3fbf2 WatchSource:0}: Error finding container 747653ccfeff6787228d378386d7841663c4d57baf671cec812ed9ad49c3fbf2: Status 404 returned error can't find the container with id 747653ccfeff6787228d378386d7841663c4d57baf671cec812ed9ad49c3fbf2 Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.032757 4748 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.042207 4748 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.219019 4748 generic.go:334] "Generic (PLEG): container finished" podID="16bd6321-67e6-40c7-9ad0-5c9035765e5d" containerID="e2f0d8762f00a604fbb1b3a62a198a667ad36bde99278eeeacf3ce67d97e3a55" exitCode=0 Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.219058 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" event={"ID":"16bd6321-67e6-40c7-9ad0-5c9035765e5d","Type":"ContainerDied","Data":"e2f0d8762f00a604fbb1b3a62a198a667ad36bde99278eeeacf3ce67d97e3a55"} Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.221512 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" event={"ID":"699d3fbb-5c31-4588-a2ea-29a8168a6954","Type":"ContainerStarted","Data":"9853d350ffe24e7f025547e96cd346eaa56ddf8892f8c9d687759d4b5a41d29d"} Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.221634 4748 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" event={"ID":"699d3fbb-5c31-4588-a2ea-29a8168a6954","Type":"ContainerStarted","Data":"747653ccfeff6787228d378386d7841663c4d57baf671cec812ed9ad49c3fbf2"} Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.226877 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerStarted","Data":"0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c"} Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.227267 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.227287 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.227437 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.261332 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.266716 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.289160 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podStartSLOduration=60.289142947 podStartE2EDuration="1m0.289142947s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:04.288692086 +0000 UTC m=+119.430237910" 
watchObservedRunningTime="2026-03-20 10:38:04.289142947 +0000 UTC m=+119.430688751" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.290866 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wflh" podStartSLOduration=60.290824047 podStartE2EDuration="1m0.290824047s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:04.260612224 +0000 UTC m=+119.402158048" watchObservedRunningTime="2026-03-20 10:38:04.290824047 +0000 UTC m=+119.432370081" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.515095 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.515147 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:04 crc kubenswrapper[4748]: E0320 10:38:04.515280 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.515308 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:38:04 crc kubenswrapper[4748]: E0320 10:38:04.515640 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.515721 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:04 crc kubenswrapper[4748]: E0320 10:38:04.515783 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:38:04 crc kubenswrapper[4748]: I0320 10:38:04.515947 4748 scope.go:117] "RemoveContainer" containerID="abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6" Mar 20 10:38:04 crc kubenswrapper[4748]: E0320 10:38:04.515968 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:38:05 crc kubenswrapper[4748]: I0320 10:38:05.240607 4748 generic.go:334] "Generic (PLEG): container finished" podID="16bd6321-67e6-40c7-9ad0-5c9035765e5d" containerID="24d026ae863793350369b1c9b55a87c7622151de942f576cf82f45d193aaf817" exitCode=0 Mar 20 10:38:05 crc kubenswrapper[4748]: I0320 10:38:05.240718 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" event={"ID":"16bd6321-67e6-40c7-9ad0-5c9035765e5d","Type":"ContainerDied","Data":"24d026ae863793350369b1c9b55a87c7622151de942f576cf82f45d193aaf817"} Mar 20 10:38:05 crc kubenswrapper[4748]: I0320 10:38:05.247679 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:38:05 crc kubenswrapper[4748]: I0320 10:38:05.253203 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4"} Mar 20 10:38:05 crc kubenswrapper[4748]: I0320 10:38:05.320802 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.320774563 podStartE2EDuration="23.320774563s" podCreationTimestamp="2026-03-20 10:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:05.319463351 +0000 UTC m=+120.461009175" watchObservedRunningTime="2026-03-20 10:38:05.320774563 +0000 UTC m=+120.462320387" Mar 20 10:38:05 crc kubenswrapper[4748]: E0320 10:38:05.501254 4748 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 10:38:05 crc kubenswrapper[4748]: 
E0320 10:38:05.605718 4748 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:38:06 crc kubenswrapper[4748]: I0320 10:38:06.260247 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" event={"ID":"16bd6321-67e6-40c7-9ad0-5c9035765e5d","Type":"ContainerStarted","Data":"4e5e34bea2dca61a35c8e47ad87a76ec73ae133830aae0a6a1149625d23dfbec"} Mar 20 10:38:06 crc kubenswrapper[4748]: I0320 10:38:06.280632 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qnjmr" podStartSLOduration=62.280613697 podStartE2EDuration="1m2.280613697s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:06.280215467 +0000 UTC m=+121.421761281" watchObservedRunningTime="2026-03-20 10:38:06.280613697 +0000 UTC m=+121.422159521" Mar 20 10:38:06 crc kubenswrapper[4748]: I0320 10:38:06.312528 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5jzd5"] Mar 20 10:38:06 crc kubenswrapper[4748]: I0320 10:38:06.312683 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:06 crc kubenswrapper[4748]: E0320 10:38:06.312792 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:38:06 crc kubenswrapper[4748]: I0320 10:38:06.514936 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:06 crc kubenswrapper[4748]: I0320 10:38:06.514932 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:06 crc kubenswrapper[4748]: E0320 10:38:06.515102 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:38:06 crc kubenswrapper[4748]: E0320 10:38:06.515350 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:38:06 crc kubenswrapper[4748]: I0320 10:38:06.515599 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:38:06 crc kubenswrapper[4748]: E0320 10:38:06.515777 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:38:06 crc kubenswrapper[4748]: I0320 10:38:06.571013 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:06 crc kubenswrapper[4748]: E0320 10:38:06.571214 4748 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:38:06 crc kubenswrapper[4748]: E0320 10:38:06.571280 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs podName:d7a2dfc5-1dd9-4ef1-9419-39f60da74b16 nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.571260619 +0000 UTC m=+137.712806443 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs") pod "network-metrics-daemon-5jzd5" (UID: "d7a2dfc5-1dd9-4ef1-9419-39f60da74b16") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:38:08 crc kubenswrapper[4748]: I0320 10:38:08.514917 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:38:08 crc kubenswrapper[4748]: I0320 10:38:08.514947 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:08 crc kubenswrapper[4748]: I0320 10:38:08.515015 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:08 crc kubenswrapper[4748]: I0320 10:38:08.515050 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:08 crc kubenswrapper[4748]: E0320 10:38:08.515808 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:38:08 crc kubenswrapper[4748]: E0320 10:38:08.515891 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:38:08 crc kubenswrapper[4748]: E0320 10:38:08.516013 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:38:08 crc kubenswrapper[4748]: E0320 10:38:08.516218 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:38:08 crc kubenswrapper[4748]: I0320 10:38:08.533930 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 10:38:09 crc kubenswrapper[4748]: I0320 10:38:09.776885 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:38:10 crc kubenswrapper[4748]: I0320 10:38:10.514513 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:10 crc kubenswrapper[4748]: I0320 10:38:10.514551 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:38:10 crc kubenswrapper[4748]: I0320 10:38:10.514722 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:10 crc kubenswrapper[4748]: E0320 10:38:10.514869 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:38:10 crc kubenswrapper[4748]: I0320 10:38:10.515005 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:10 crc kubenswrapper[4748]: E0320 10:38:10.515197 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:38:10 crc kubenswrapper[4748]: E0320 10:38:10.515287 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jzd5" podUID="d7a2dfc5-1dd9-4ef1-9419-39f60da74b16" Mar 20 10:38:10 crc kubenswrapper[4748]: E0320 10:38:10.515378 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.515091 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.515125 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.515161 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.515224 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.517998 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.518988 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.521067 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.521714 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.521727 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.522138 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 10:38:12 crc kubenswrapper[4748]: I0320 10:38:12.938518 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.123954 4748 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.191532 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=5.191504822 podStartE2EDuration="5.191504822s" podCreationTimestamp="2026-03-20 10:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:13.018032892 +0000 UTC m=+128.159578736" watchObservedRunningTime="2026-03-20 10:38:13.191504822 +0000 UTC m=+128.333050636" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.192209 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.192696 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.195515 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.195939 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.201328 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.201463 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.201528 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.201920 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.202100 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.202241 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.202383 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.202529 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.236671 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.237259 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.237629 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.237962 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.238355 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9x2kc"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.239734 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.241369 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.241583 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.241708 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.241851 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.241957 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.242140 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fftbt"] 
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.242543 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.242763 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.242859 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.242997 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.243060 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qq9xx"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.243129 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.243244 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fftbt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.243516 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.243526 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.244355 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9lbsk"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.245055 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8ttxd"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.245145 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.245421 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.245703 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.246081 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.246269 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.248999 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.250056 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.250561 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.250776 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.250889 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.250936 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.251048 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.251060 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.251154 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.251198 4748 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.251417 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.251528 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.251796 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.251994 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.252343 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.253019 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c5vw7"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.253507 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.254647 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tlkp2"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.254895 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.255310 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-rbx59"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.255791 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.255817 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.256032 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.257514 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.257737 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.257818 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.257895 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.257976 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258054 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 10:38:13 crc 
kubenswrapper[4748]: I0320 10:38:13.258056 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258102 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258184 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258210 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258225 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258226 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258313 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258319 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258401 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258187 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258549 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258402 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258675 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258807 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258853 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.258957 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.259077 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.259677 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.260024 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.261004 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.271746 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.273570 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.274075 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.274426 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.275585 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.278993 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.279091 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.279405 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.279771 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.283982 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.287034 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.287093 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.287246 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.287730 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.287965 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.288075 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.288444 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.288544 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.289122 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.289514 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.294897 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-hxtfq"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.295560 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.295949 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.296060 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hxtfq"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.306047 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xpncn"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.306546 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.306951 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.307212 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.307690 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.307937 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.308106 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.308327 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.308527 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.308571 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.308661 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.308728 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.308878 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.308981 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.309079 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.309170 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.309385 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.309985 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310043 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310138 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310155 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310219 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310258 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310291 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310363 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310444 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310482 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310544 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310632 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310715 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310724 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.310633 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.312737 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.312824 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7cnpj"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.313457 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.313698 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.313949 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.317477 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4svjh"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.317937 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4svjh"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.318365 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dmmdh"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.318685 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.319134 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.320405 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.321002 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.321427 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.321456 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.322474 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.323223 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.323640 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.324207 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.326240 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.326616 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hg8kp"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.332003 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.332632 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.333724 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.335889 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.343368 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.348037 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w"]
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.354381 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.355732 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356353 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356446 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356659 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f21328d-966f-4d28-b80a-c950c1c83b6e-config\") pod \"kube-apiserver-operator-766d6c64bb-6q2px\" (UID: \"0f21328d-966f-4d28-b80a-c950c1c83b6e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356705 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/61338f55-8e52-48cb-962d-40aefc460991-etcd-service-ca\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356731 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmk9t\" (UniqueName: \"kubernetes.io/projected/51c43cdc-1f52-4850-9c2d-d317dc38c754-kube-api-access-gmk9t\") pod \"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356748 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e24b6492-332b-421d-b2cd-ab1f11e06432-encryption-config\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356763 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122fe6aa-0852-4ddd-a678-6ce541c38250-config\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356780 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb3bf250-9fda-4426-baa3-48eed453f90d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7zqwq\" (UID: \"eb3bf250-9fda-4426-baa3-48eed453f90d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356794 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f21328d-966f-4d28-b80a-c950c1c83b6e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6q2px\" (UID: \"0f21328d-966f-4d28-b80a-c950c1c83b6e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356809 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xpncn\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpncn"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356872 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c8e6760-4ff7-4417-917e-61bdc115d710-service-ca-bundle\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.356924 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-image-import-ca\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357015 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff567791-a234-47f5-8350-154f93477bb9-encryption-config\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357045 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb3bf250-9fda-4426-baa3-48eed453f90d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7zqwq\" (UID: \"eb3bf250-9fda-4426-baa3-48eed453f90d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357085 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/122fe6aa-0852-4ddd-a678-6ce541c38250-trusted-ca\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357102 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61338f55-8e52-48cb-962d-40aefc460991-config\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357120 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/61338f55-8e52-48cb-962d-40aefc460991-etcd-ca\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357193 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c6c249f-695d-4875-94ad-a608e8bd7d5f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-26svx\" (UID: \"6c6c249f-695d-4875-94ad-a608e8bd7d5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357220 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a3c049c-4ed1-4b65-9ba9-854aa25764e5-metrics-tls\") pod \"dns-operator-744455d44c-fftbt\" (UID: \"8a3c049c-4ed1-4b65-9ba9-854aa25764e5\") " pod="openshift-dns-operator/dns-operator-744455d44c-fftbt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357268 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e36c673-8d30-4267-a115-ec3b62ad093a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357317 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61338f55-8e52-48cb-962d-40aefc460991-serving-cert\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357342 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dd2dd89-ba90-440f-abc8-74ab27d7db69-config-volume\") pod \"collect-profiles-29566710-vh2t7\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357359 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e24b6492-332b-421d-b2cd-ab1f11e06432-etcd-client\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357376 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-config\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357404 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2cc53e20-383b-4e3a-a00a-d54ac8272e00-images\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357420 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49x5z\" (UniqueName: \"kubernetes.io/projected/8a3c049c-4ed1-4b65-9ba9-854aa25764e5-kube-api-access-49x5z\") pod \"dns-operator-744455d44c-fftbt\" (UID: \"8a3c049c-4ed1-4b65-9ba9-854aa25764e5\") " pod="openshift-dns-operator/dns-operator-744455d44c-fftbt"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357441 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e24b6492-332b-421d-b2cd-ab1f11e06432-audit-dir\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357458 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-profile-collector-cert\") pod \"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357475 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-serving-cert\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357501 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6110c56-5634-4ef9-92b1-4c7c75dd4986-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v79zw\" (UID: \"a6110c56-5634-4ef9-92b1-4c7c75dd4986\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357526 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8rk6\" (UniqueName: \"kubernetes.io/projected/a6110c56-5634-4ef9-92b1-4c7c75dd4986-kube-api-access-n8rk6\") pod \"control-plane-machine-set-operator-78cbb6b69f-v79zw\" (UID: \"a6110c56-5634-4ef9-92b1-4c7c75dd4986\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357556 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357571 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/122fe6aa-0852-4ddd-a678-6ce541c38250-serving-cert\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357583 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357605 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554afb3a-2388-4d1f-8949-7c9c9941d685-config\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357624 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/64222460-9199-40c1-bf14-07be02855e39-signing-key\") pod \"service-ca-9c57cc56f-4svjh\" (UID: \"64222460-9199-40c1-bf14-07be02855e39\") " pod="openshift-service-ca/service-ca-9c57cc56f-4svjh"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357640 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dd2dd89-ba90-440f-abc8-74ab27d7db69-secret-volume\") pod \"collect-profiles-29566710-vh2t7\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357661 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec3ee66-08f2-44d5-a414-90354f5a4a9e-serving-cert\") pod \"openshift-config-operator-7777fb866f-4k4dq\" (UID: \"3ec3ee66-08f2-44d5-a414-90354f5a4a9e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357680 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kmfs\" (UniqueName: \"kubernetes.io/projected/2cc53e20-383b-4e3a-a00a-d54ac8272e00-kube-api-access-7kmfs\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357696 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/61338f55-8e52-48cb-962d-40aefc460991-etcd-client\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357712 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff567791-a234-47f5-8350-154f93477bb9-serving-cert\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357740 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjt6c\" (UniqueName: \"kubernetes.io/projected/c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c-kube-api-access-xjt6c\") pod \"openshift-controller-manager-operator-756b6f6bc6-tqb9k\" (UID: \"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357761 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c8e6760-4ff7-4417-917e-61bdc115d710-default-certificate\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357779 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3bf250-9fda-4426-baa3-48eed453f90d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7zqwq\" (UID: \"eb3bf250-9fda-4426-baa3-48eed453f90d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357811 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e36c673-8d30-4267-a115-ec3b62ad093a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn"
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357853 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-hp5qr\" (UniqueName: \"kubernetes.io/projected/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-kube-api-access-hp5qr\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357879 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tqb9k\" (UID: \"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357896 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/554afb3a-2388-4d1f-8949-7c9c9941d685-auth-proxy-config\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357918 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7bfg\" (UniqueName: \"kubernetes.io/projected/6c6c249f-695d-4875-94ad-a608e8bd7d5f-kube-api-access-x7bfg\") pod \"cluster-samples-operator-665b6dd947-26svx\" (UID: \"6c6c249f-695d-4875-94ad-a608e8bd7d5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357939 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f21328d-966f-4d28-b80a-c950c1c83b6e-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-6q2px\" (UID: \"0f21328d-966f-4d28-b80a-c950c1c83b6e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357956 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff567791-a234-47f5-8350-154f93477bb9-node-pullsecrets\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.357996 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29kvb\" (UniqueName: \"kubernetes.io/projected/122fe6aa-0852-4ddd-a678-6ce541c38250-kube-api-access-29kvb\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358013 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc53e20-383b-4e3a-a00a-d54ac8272e00-config\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358039 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9zp\" (UniqueName: \"kubernetes.io/projected/64222460-9199-40c1-bf14-07be02855e39-kube-api-access-5q9zp\") pod \"service-ca-9c57cc56f-4svjh\" (UID: \"64222460-9199-40c1-bf14-07be02855e39\") " pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358055 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c8e6760-4ff7-4417-917e-61bdc115d710-metrics-certs\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358070 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5ssh\" (UniqueName: \"kubernetes.io/projected/2c8e6760-4ff7-4417-917e-61bdc115d710-kube-api-access-x5ssh\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358089 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3ec3ee66-08f2-44d5-a414-90354f5a4a9e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4k4dq\" (UID: \"3ec3ee66-08f2-44d5-a414-90354f5a4a9e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358104 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24b6492-332b-421d-b2cd-ab1f11e06432-serving-cert\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358118 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-audit\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " 
pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358159 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-srv-cert\") pod \"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358176 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358192 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e24b6492-332b-421d-b2cd-ab1f11e06432-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358208 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff567791-a234-47f5-8350-154f93477bb9-etcd-client\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358224 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsn8k\" (UniqueName: 
\"kubernetes.io/projected/554afb3a-2388-4d1f-8949-7c9c9941d685-kube-api-access-zsn8k\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358239 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/64222460-9199-40c1-bf14-07be02855e39-signing-cabundle\") pod \"service-ca-9c57cc56f-4svjh\" (UID: \"64222460-9199-40c1-bf14-07be02855e39\") " pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358288 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-client-ca\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358308 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tqb9k\" (UID: \"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358345 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-etcd-serving-ca\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 
10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358369 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f7tz\" (UniqueName: \"kubernetes.io/projected/e24b6492-332b-421d-b2cd-ab1f11e06432-kube-api-access-6f7tz\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358387 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wlws\" (UniqueName: \"kubernetes.io/projected/ff567791-a234-47f5-8350-154f93477bb9-kube-api-access-4wlws\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358403 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlb6s\" (UniqueName: \"kubernetes.io/projected/0e36c673-8d30-4267-a115-ec3b62ad093a-kube-api-access-zlb6s\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358424 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c8e6760-4ff7-4417-917e-61bdc115d710-stats-auth\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358482 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbbgb\" (UniqueName: 
\"kubernetes.io/projected/3ec3ee66-08f2-44d5-a414-90354f5a4a9e-kube-api-access-nbbgb\") pod \"openshift-config-operator-7777fb866f-4k4dq\" (UID: \"3ec3ee66-08f2-44d5-a414-90354f5a4a9e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358540 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e24b6492-332b-421d-b2cd-ab1f11e06432-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358565 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cc53e20-383b-4e3a-a00a-d54ac8272e00-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358590 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/554afb3a-2388-4d1f-8949-7c9c9941d685-machine-approver-tls\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358609 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e36c673-8d30-4267-a115-ec3b62ad093a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358664 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxwbs\" (UniqueName: \"kubernetes.io/projected/2dd2dd89-ba90-440f-abc8-74ab27d7db69-kube-api-access-sxwbs\") pod \"collect-profiles-29566710-vh2t7\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358691 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff567791-a234-47f5-8350-154f93477bb9-audit-dir\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358717 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wng4v\" (UniqueName: \"kubernetes.io/projected/61338f55-8e52-48cb-962d-40aefc460991-kube-api-access-wng4v\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358747 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8ch9\" (UniqueName: \"kubernetes.io/projected/0a5bf285-90ef-47c5-a959-7bae63c410a5-kube-api-access-m8ch9\") pod \"downloads-7954f5f757-hxtfq\" (UID: \"0a5bf285-90ef-47c5-a959-7bae63c410a5\") " pod="openshift-console/downloads-7954f5f757-hxtfq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358750 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 
20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358792 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e24b6492-332b-421d-b2cd-ab1f11e06432-audit-policies\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358817 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-config\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.358964 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.359221 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.359680 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.360239 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.360446 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.360460 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566718-7hmv9"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.369376 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.369462 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566718-7hmv9" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.369490 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.369658 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.369679 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fftbt"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.369691 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vn8zj"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.370154 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9lbsk"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.370201 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vn8zj" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.370640 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.371550 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.372521 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.374568 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.374730 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.375672 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7cnpj"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.377095 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.378301 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.379504 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.380460 4748 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.381960 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xpncn"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.382645 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.383923 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9x2kc"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.387612 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c5vw7"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.388629 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dmmdh"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.389929 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tlkp2"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.390939 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8ttxd"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.392332 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.394289 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.394316 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hg8kp"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 
10:38:13.395274 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hxtfq"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.397964 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.398017 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.400912 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.401290 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4svjh"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.403943 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.403988 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.404000 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qq9xx"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.405477 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.407270 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.408414 4748 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-cfzr7"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.408878 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.416806 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.419863 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.420237 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.422711 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vn8zj"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.427849 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566718-7hmv9"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.429773 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.431278 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cfzr7"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.432592 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-p52m7"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.433718 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.435208 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-9cpg7"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.435316 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.435796 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.455827 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.459486 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e36c673-8d30-4267-a115-ec3b62ad093a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.459603 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61338f55-8e52-48cb-962d-40aefc460991-serving-cert\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.459671 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dd2dd89-ba90-440f-abc8-74ab27d7db69-config-volume\") pod \"collect-profiles-29566710-vh2t7\" (UID: 
\"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.459737 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-config\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.459802 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2cc53e20-383b-4e3a-a00a-d54ac8272e00-images\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.459903 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49x5z\" (UniqueName: \"kubernetes.io/projected/8a3c049c-4ed1-4b65-9ba9-854aa25764e5-kube-api-access-49x5z\") pod \"dns-operator-744455d44c-fftbt\" (UID: \"8a3c049c-4ed1-4b65-9ba9-854aa25764e5\") " pod="openshift-dns-operator/dns-operator-744455d44c-fftbt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.459973 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e24b6492-332b-421d-b2cd-ab1f11e06432-etcd-client\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460043 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460113 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-serving-cert\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460185 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9cpv\" (UniqueName: \"kubernetes.io/projected/9d26944d-4b2d-4411-aa57-e74df0293000-kube-api-access-z9cpv\") pod \"kube-storage-version-migrator-operator-b67b599dd-p46hq\" (UID: \"9d26944d-4b2d-4411-aa57-e74df0293000\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460288 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-webhook-cert\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460365 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e24b6492-332b-421d-b2cd-ab1f11e06432-audit-dir\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460431 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8rk6\" (UniqueName: \"kubernetes.io/projected/a6110c56-5634-4ef9-92b1-4c7c75dd4986-kube-api-access-n8rk6\") pod \"control-plane-machine-set-operator-78cbb6b69f-v79zw\" (UID: \"a6110c56-5634-4ef9-92b1-4c7c75dd4986\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460505 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6110c56-5634-4ef9-92b1-4c7c75dd4986-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v79zw\" (UID: \"a6110c56-5634-4ef9-92b1-4c7c75dd4986\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460576 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460647 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/122fe6aa-0852-4ddd-a678-6ce541c38250-serving-cert\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460739 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e24b6492-332b-421d-b2cd-ab1f11e06432-audit-dir\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: 
\"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.460886 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554afb3a-2388-4d1f-8949-7c9c9941d685-config\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.461557 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/64222460-9199-40c1-bf14-07be02855e39-signing-key\") pod \"service-ca-9c57cc56f-4svjh\" (UID: \"64222460-9199-40c1-bf14-07be02855e39\") " pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.461642 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dd2dd89-ba90-440f-abc8-74ab27d7db69-secret-volume\") pod \"collect-profiles-29566710-vh2t7\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.461237 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e36c673-8d30-4267-a115-ec3b62ad093a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.461498 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2cc53e20-383b-4e3a-a00a-d54ac8272e00-images\") pod 
\"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.461381 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-config\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.461393 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554afb3a-2388-4d1f-8949-7c9c9941d685-config\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.461902 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.462349 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec3ee66-08f2-44d5-a414-90354f5a4a9e-serving-cert\") pod \"openshift-config-operator-7777fb866f-4k4dq\" (UID: \"3ec3ee66-08f2-44d5-a414-90354f5a4a9e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.462469 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kmfs\" (UniqueName: 
\"kubernetes.io/projected/2cc53e20-383b-4e3a-a00a-d54ac8272e00-kube-api-access-7kmfs\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.462545 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/61338f55-8e52-48cb-962d-40aefc460991-etcd-client\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.462617 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjt6c\" (UniqueName: \"kubernetes.io/projected/c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c-kube-api-access-xjt6c\") pod \"openshift-controller-manager-operator-756b6f6bc6-tqb9k\" (UID: \"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.463234 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c8e6760-4ff7-4417-917e-61bdc115d710-default-certificate\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.463345 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng4rk\" (UniqueName: \"kubernetes.io/projected/9abcdd14-d386-4279-9d4c-4a7326a32a11-kube-api-access-ng4rk\") pod \"marketplace-operator-79b997595-xpncn\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 
10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.463456 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff567791-a234-47f5-8350-154f93477bb9-serving-cert\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.463526 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3bf250-9fda-4426-baa3-48eed453f90d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7zqwq\" (UID: \"eb3bf250-9fda-4426-baa3-48eed453f90d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.463625 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e36c673-8d30-4267-a115-ec3b62ad093a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.464378 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp5qr\" (UniqueName: \"kubernetes.io/projected/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-kube-api-access-hp5qr\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.464505 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-tqb9k\" (UID: \"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.464626 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7bfg\" (UniqueName: \"kubernetes.io/projected/6c6c249f-695d-4875-94ad-a608e8bd7d5f-kube-api-access-x7bfg\") pod \"cluster-samples-operator-665b6dd947-26svx\" (UID: \"6c6c249f-695d-4875-94ad-a608e8bd7d5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.464337 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3bf250-9fda-4426-baa3-48eed453f90d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7zqwq\" (UID: \"eb3bf250-9fda-4426-baa3-48eed453f90d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.464946 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f21328d-966f-4d28-b80a-c950c1c83b6e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6q2px\" (UID: \"0f21328d-966f-4d28-b80a-c950c1c83b6e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.465018 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff567791-a234-47f5-8350-154f93477bb9-node-pullsecrets\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.465179 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/554afb3a-2388-4d1f-8949-7c9c9941d685-auth-proxy-config\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.465261 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc53e20-383b-4e3a-a00a-d54ac8272e00-config\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.465336 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29kvb\" (UniqueName: \"kubernetes.io/projected/122fe6aa-0852-4ddd-a678-6ce541c38250-kube-api-access-29kvb\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.465406 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9zp\" (UniqueName: \"kubernetes.io/projected/64222460-9199-40c1-bf14-07be02855e39-kube-api-access-5q9zp\") pod \"service-ca-9c57cc56f-4svjh\" (UID: \"64222460-9199-40c1-bf14-07be02855e39\") " pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.465155 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff567791-a234-47f5-8350-154f93477bb9-node-pullsecrets\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 
20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.466517 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc53e20-383b-4e3a-a00a-d54ac8272e00-config\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.466775 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61338f55-8e52-48cb-962d-40aefc460991-serving-cert\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.466947 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c8e6760-4ff7-4417-917e-61bdc115d710-default-certificate\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.467259 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/122fe6aa-0852-4ddd-a678-6ce541c38250-serving-cert\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.467587 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e24b6492-332b-421d-b2cd-ab1f11e06432-etcd-client\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc 
kubenswrapper[4748]: I0320 10:38:13.467762 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff567791-a234-47f5-8350-154f93477bb9-serving-cert\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.468780 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec3ee66-08f2-44d5-a414-90354f5a4a9e-serving-cert\") pod \"openshift-config-operator-7777fb866f-4k4dq\" (UID: \"3ec3ee66-08f2-44d5-a414-90354f5a4a9e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474071 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c8e6760-4ff7-4417-917e-61bdc115d710-metrics-certs\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474349 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5ssh\" (UniqueName: \"kubernetes.io/projected/2c8e6760-4ff7-4417-917e-61bdc115d710-kube-api-access-x5ssh\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474377 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a32808c8-cba3-478e-bcf0-01434602a895-proxy-tls\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474397 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10c5886c-4f03-45fb-b71a-a9213027a4fd-tmpfs\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474432 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3ec3ee66-08f2-44d5-a414-90354f5a4a9e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4k4dq\" (UID: \"3ec3ee66-08f2-44d5-a414-90354f5a4a9e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474449 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24b6492-332b-421d-b2cd-ab1f11e06432-serving-cert\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474467 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-audit\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474489 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-srv-cert\") pod 
\"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474509 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474529 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d26944d-4b2d-4411-aa57-e74df0293000-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p46hq\" (UID: \"9d26944d-4b2d-4411-aa57-e74df0293000\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474546 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e24b6492-332b-421d-b2cd-ab1f11e06432-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474563 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff567791-a234-47f5-8350-154f93477bb9-etcd-client\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474580 4748 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zsn8k\" (UniqueName: \"kubernetes.io/projected/554afb3a-2388-4d1f-8949-7c9c9941d685-kube-api-access-zsn8k\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474598 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/64222460-9199-40c1-bf14-07be02855e39-signing-cabundle\") pod \"service-ca-9c57cc56f-4svjh\" (UID: \"64222460-9199-40c1-bf14-07be02855e39\") " pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474618 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tqb9k\" (UID: \"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474645 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-client-ca\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474663 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-etcd-serving-ca\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc 
kubenswrapper[4748]: I0320 10:38:13.474680 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a32808c8-cba3-478e-bcf0-01434602a895-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474701 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wlws\" (UniqueName: \"kubernetes.io/projected/ff567791-a234-47f5-8350-154f93477bb9-kube-api-access-4wlws\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474719 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlb6s\" (UniqueName: \"kubernetes.io/projected/0e36c673-8d30-4267-a115-ec3b62ad093a-kube-api-access-zlb6s\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474738 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c8e6760-4ff7-4417-917e-61bdc115d710-stats-auth\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474760 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f7tz\" (UniqueName: \"kubernetes.io/projected/e24b6492-332b-421d-b2cd-ab1f11e06432-kube-api-access-6f7tz\") pod 
\"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474776 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e24b6492-332b-421d-b2cd-ab1f11e06432-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474797 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cc53e20-383b-4e3a-a00a-d54ac8272e00-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474821 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/554afb3a-2388-4d1f-8949-7c9c9941d685-machine-approver-tls\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474863 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e36c673-8d30-4267-a115-ec3b62ad093a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474888 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nbbgb\" (UniqueName: \"kubernetes.io/projected/3ec3ee66-08f2-44d5-a414-90354f5a4a9e-kube-api-access-nbbgb\") pod \"openshift-config-operator-7777fb866f-4k4dq\" (UID: \"3ec3ee66-08f2-44d5-a414-90354f5a4a9e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474909 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxwbs\" (UniqueName: \"kubernetes.io/projected/2dd2dd89-ba90-440f-abc8-74ab27d7db69-kube-api-access-sxwbs\") pod \"collect-profiles-29566710-vh2t7\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474925 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff567791-a234-47f5-8350-154f93477bb9-audit-dir\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474942 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wng4v\" (UniqueName: \"kubernetes.io/projected/61338f55-8e52-48cb-962d-40aefc460991-kube-api-access-wng4v\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474959 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8ch9\" (UniqueName: \"kubernetes.io/projected/0a5bf285-90ef-47c5-a959-7bae63c410a5-kube-api-access-m8ch9\") pod \"downloads-7954f5f757-hxtfq\" (UID: \"0a5bf285-90ef-47c5-a959-7bae63c410a5\") " pod="openshift-console/downloads-7954f5f757-hxtfq" 
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474978 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xpncn\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475000 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-config\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475026 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e24b6492-332b-421d-b2cd-ab1f11e06432-audit-policies\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475048 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f21328d-966f-4d28-b80a-c950c1c83b6e-config\") pod \"kube-apiserver-operator-766d6c64bb-6q2px\" (UID: \"0f21328d-966f-4d28-b80a-c950c1c83b6e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475067 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a32808c8-cba3-478e-bcf0-01434602a895-images\") pod \"machine-config-operator-74547568cd-n6wkh\" 
(UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475088 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmk9t\" (UniqueName: \"kubernetes.io/projected/51c43cdc-1f52-4850-9c2d-d317dc38c754-kube-api-access-gmk9t\") pod \"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475106 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb2j6\" (UniqueName: \"kubernetes.io/projected/10c5886c-4f03-45fb-b71a-a9213027a4fd-kube-api-access-cb2j6\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475123 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e24b6492-332b-421d-b2cd-ab1f11e06432-encryption-config\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475140 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122fe6aa-0852-4ddd-a678-6ce541c38250-config\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475159 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/61338f55-8e52-48cb-962d-40aefc460991-etcd-service-ca\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475179 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-apiservice-cert\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475220 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb3bf250-9fda-4426-baa3-48eed453f90d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7zqwq\" (UID: \"eb3bf250-9fda-4426-baa3-48eed453f90d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475238 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f21328d-966f-4d28-b80a-c950c1c83b6e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6q2px\" (UID: \"0f21328d-966f-4d28-b80a-c950c1c83b6e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475259 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xpncn\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475275 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-image-import-ca\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475292 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c8e6760-4ff7-4417-917e-61bdc115d710-service-ca-bundle\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475293 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-audit\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475309 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27knb\" (UniqueName: \"kubernetes.io/projected/a32808c8-cba3-478e-bcf0-01434602a895-kube-api-access-27knb\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475330 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff567791-a234-47f5-8350-154f93477bb9-encryption-config\") pod 
\"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475348 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb3bf250-9fda-4426-baa3-48eed453f90d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7zqwq\" (UID: \"eb3bf250-9fda-4426-baa3-48eed453f90d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475378 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/122fe6aa-0852-4ddd-a678-6ce541c38250-trusted-ca\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475393 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61338f55-8e52-48cb-962d-40aefc460991-config\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475407 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/61338f55-8e52-48cb-962d-40aefc460991-etcd-ca\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475426 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d26944d-4b2d-4411-aa57-e74df0293000-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p46hq\" (UID: \"9d26944d-4b2d-4411-aa57-e74df0293000\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475451 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c6c249f-695d-4875-94ad-a608e8bd7d5f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-26svx\" (UID: \"6c6c249f-695d-4875-94ad-a608e8bd7d5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.475466 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a3c049c-4ed1-4b65-9ba9-854aa25764e5-metrics-tls\") pod \"dns-operator-744455d44c-fftbt\" (UID: \"8a3c049c-4ed1-4b65-9ba9-854aa25764e5\") " pod="openshift-dns-operator/dns-operator-744455d44c-fftbt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.476248 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tqb9k\" (UID: \"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.476305 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e24b6492-332b-421d-b2cd-ab1f11e06432-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 
10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.476350 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-etcd-serving-ca\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.476818 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/61338f55-8e52-48cb-962d-40aefc460991-etcd-service-ca\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.474769 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3ec3ee66-08f2-44d5-a414-90354f5a4a9e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4k4dq\" (UID: \"3ec3ee66-08f2-44d5-a414-90354f5a4a9e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.477517 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c8e6760-4ff7-4417-917e-61bdc115d710-service-ca-bundle\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.477596 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61338f55-8e52-48cb-962d-40aefc460991-config\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.477650 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff567791-a234-47f5-8350-154f93477bb9-audit-dir\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.478283 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ff567791-a234-47f5-8350-154f93477bb9-image-import-ca\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.478456 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e24b6492-332b-421d-b2cd-ab1f11e06432-audit-policies\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.478613 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/61338f55-8e52-48cb-962d-40aefc460991-etcd-ca\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.478683 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e24b6492-332b-421d-b2cd-ab1f11e06432-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" 
Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.479228 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c8e6760-4ff7-4417-917e-61bdc115d710-metrics-certs\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.479247 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tqb9k\" (UID: \"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.479896 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e36c673-8d30-4267-a115-ec3b62ad093a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.480353 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a3c049c-4ed1-4b65-9ba9-854aa25764e5-metrics-tls\") pod \"dns-operator-744455d44c-fftbt\" (UID: \"8a3c049c-4ed1-4b65-9ba9-854aa25764e5\") " pod="openshift-dns-operator/dns-operator-744455d44c-fftbt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.480643 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24b6492-332b-421d-b2cd-ab1f11e06432-serving-cert\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: 
\"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.480743 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/122fe6aa-0852-4ddd-a678-6ce541c38250-trusted-ca\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.481462 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122fe6aa-0852-4ddd-a678-6ce541c38250-config\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.482198 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff567791-a234-47f5-8350-154f93477bb9-etcd-client\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.482325 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c6c249f-695d-4875-94ad-a608e8bd7d5f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-26svx\" (UID: \"6c6c249f-695d-4875-94ad-a608e8bd7d5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.482556 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/61338f55-8e52-48cb-962d-40aefc460991-etcd-client\") pod \"etcd-operator-b45778765-8ttxd\" (UID: 
\"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.482576 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff567791-a234-47f5-8350-154f93477bb9-encryption-config\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.482747 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e24b6492-332b-421d-b2cd-ab1f11e06432-encryption-config\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.482951 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/554afb3a-2388-4d1f-8949-7c9c9941d685-machine-approver-tls\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.483218 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb3bf250-9fda-4426-baa3-48eed453f90d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7zqwq\" (UID: \"eb3bf250-9fda-4426-baa3-48eed453f90d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.483827 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2cc53e20-383b-4e3a-a00a-d54ac8272e00-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.484890 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c8e6760-4ff7-4417-917e-61bdc115d710-stats-auth\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.485498 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.496457 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.508766 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dwfwm"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.509641 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dwfwm" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.514521 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.522946 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dwfwm"] Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.554820 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.575652 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576240 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a32808c8-cba3-478e-bcf0-01434602a895-images\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576302 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb2j6\" (UniqueName: \"kubernetes.io/projected/10c5886c-4f03-45fb-b71a-a9213027a4fd-kube-api-access-cb2j6\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576322 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-apiservice-cert\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: 
\"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576353 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27knb\" (UniqueName: \"kubernetes.io/projected/a32808c8-cba3-478e-bcf0-01434602a895-kube-api-access-27knb\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576389 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d26944d-4b2d-4411-aa57-e74df0293000-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p46hq\" (UID: \"9d26944d-4b2d-4411-aa57-e74df0293000\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576425 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9cpv\" (UniqueName: \"kubernetes.io/projected/9d26944d-4b2d-4411-aa57-e74df0293000-kube-api-access-z9cpv\") pod \"kube-storage-version-migrator-operator-b67b599dd-p46hq\" (UID: \"9d26944d-4b2d-4411-aa57-e74df0293000\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576442 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-webhook-cert\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576493 
4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng4rk\" (UniqueName: \"kubernetes.io/projected/9abcdd14-d386-4279-9d4c-4a7326a32a11-kube-api-access-ng4rk\") pod \"marketplace-operator-79b997595-xpncn\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576551 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a32808c8-cba3-478e-bcf0-01434602a895-proxy-tls\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576566 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10c5886c-4f03-45fb-b71a-a9213027a4fd-tmpfs\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576605 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d26944d-4b2d-4411-aa57-e74df0293000-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p46hq\" (UID: \"9d26944d-4b2d-4411-aa57-e74df0293000\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576640 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a32808c8-cba3-478e-bcf0-01434602a895-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: 
\"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.576687 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xpncn\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.578165 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d26944d-4b2d-4411-aa57-e74df0293000-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p46hq\" (UID: \"9d26944d-4b2d-4411-aa57-e74df0293000\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.578254 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10c5886c-4f03-45fb-b71a-a9213027a4fd-tmpfs\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.578697 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a32808c8-cba3-478e-bcf0-01434602a895-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.601466 4748 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.616779 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.635115 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.637331 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/554afb3a-2388-4d1f-8949-7c9c9941d685-auth-proxy-config\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.641233 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d26944d-4b2d-4411-aa57-e74df0293000-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p46hq\" (UID: \"9d26944d-4b2d-4411-aa57-e74df0293000\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.654927 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.675661 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.695402 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.715529 4748 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.735252 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.747748 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f21328d-966f-4d28-b80a-c950c1c83b6e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6q2px\" (UID: \"0f21328d-966f-4d28-b80a-c950c1c83b6e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.756035 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.760478 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f21328d-966f-4d28-b80a-c950c1c83b6e-config\") pod \"kube-apiserver-operator-766d6c64bb-6q2px\" (UID: \"0f21328d-966f-4d28-b80a-c950c1c83b6e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.775745 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.796018 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.815406 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.847519 4748 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.855792 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.859246 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xpncn\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.874925 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.896013 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.917088 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.937343 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.956356 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.962504 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xpncn\" (UID: 
\"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.977677 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 10:38:13 crc kubenswrapper[4748]: I0320 10:38:13.995681 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.016045 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.036057 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.055221 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.057504 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a32808c8-cba3-478e-bcf0-01434602a895-images\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.075729 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.096594 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.102200 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a32808c8-cba3-478e-bcf0-01434602a895-proxy-tls\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.115640 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.135353 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.156468 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.167224 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/64222460-9199-40c1-bf14-07be02855e39-signing-key\") pod \"service-ca-9c57cc56f-4svjh\" (UID: \"64222460-9199-40c1-bf14-07be02855e39\") " pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.174719 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.178621 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/64222460-9199-40c1-bf14-07be02855e39-signing-cabundle\") pod \"service-ca-9c57cc56f-4svjh\" (UID: \"64222460-9199-40c1-bf14-07be02855e39\") " pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.195971 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.215634 4748 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.235584 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.255978 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.275885 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.296000 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.315365 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.333885 4748 request.go:700] Waited for 1.014805077s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/configmaps?fieldSelector=metadata.name%3Dtrusted-ca-bundle&limit=500&resourceVersion=0 Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.349341 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.355326 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.378385 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.389111 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.389296 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.389327 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.389481 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.389566 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.389922 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:46.389872364 +0000 UTC m=+161.531418218 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.390915 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.395065 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.395698 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.396266 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.397883 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.416764 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.425118 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6110c56-5634-4ef9-92b1-4c7c75dd4986-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v79zw\" (UID: \"a6110c56-5634-4ef9-92b1-4c7c75dd4986\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.445342 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.454772 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.460286 4748 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the 
condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.460382 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2dd2dd89-ba90-440f-abc8-74ab27d7db69-config-volume podName:2dd2dd89-ba90-440f-abc8-74ab27d7db69 nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.960357595 +0000 UTC m=+130.101903419 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/2dd2dd89-ba90-440f-abc8-74ab27d7db69-config-volume") pod "collect-profiles-29566710-vh2t7" (UID: "2dd2dd89-ba90-440f-abc8-74ab27d7db69") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.460533 4748 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.460592 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-serving-cert podName:ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.96057639 +0000 UTC m=+130.102122214 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-serving-cert") pod "controller-manager-879f6c89f-hg8kp" (UID: "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.460861 4748 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.461045 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-profile-collector-cert podName:51c43cdc-1f52-4850-9c2d-d317dc38c754 nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.961024301 +0000 UTC m=+130.102570125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-profile-collector-cert") pod "catalog-operator-68c6474976-hsr94" (UID: "51c43cdc-1f52-4850-9c2d-d317dc38c754") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.462475 4748 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.462589 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dd2dd89-ba90-440f-abc8-74ab27d7db69-secret-volume podName:2dd2dd89-ba90-440f-abc8-74ab27d7db69 nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.962560428 +0000 UTC m=+130.104106282 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/2dd2dd89-ba90-440f-abc8-74ab27d7db69-secret-volume") pod "collect-profiles-29566710-vh2t7" (UID: "2dd2dd89-ba90-440f-abc8-74ab27d7db69") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.475173 4748 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.475286 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-srv-cert podName:51c43cdc-1f52-4850-9c2d-d317dc38c754 nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.975258756 +0000 UTC m=+130.116804590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-srv-cert") pod "catalog-operator-68c6474976-hsr94" (UID: "51c43cdc-1f52-4850-9c2d-d317dc38c754") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.475611 4748 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.475813 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-proxy-ca-bundles podName:ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.975787979 +0000 UTC m=+130.117333883 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-proxy-ca-bundles") pod "controller-manager-879f6c89f-hg8kp" (UID: "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.476305 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.477933 4748 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.478090 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-config podName:ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.978076775 +0000 UTC m=+130.119622599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-config") pod "controller-manager-879f6c89f-hg8kp" (UID: "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.479003 4748 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.479190 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-client-ca podName:ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa nodeName:}" failed. No retries permitted until 2026-03-20 10:38:14.979173331 +0000 UTC m=+130.120719245 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-client-ca") pod "controller-manager-879f6c89f-hg8kp" (UID: "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.495863 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.515131 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.535534 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.556521 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.575781 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.578026 4748 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.578105 4748 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.578135 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-webhook-cert 
podName:10c5886c-4f03-45fb-b71a-a9213027a4fd nodeName:}" failed. No retries permitted until 2026-03-20 10:38:15.078107812 +0000 UTC m=+130.219653636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-webhook-cert") pod "packageserver-d55dfcdfc-zbshh" (UID: "10c5886c-4f03-45fb-b71a-a9213027a4fd") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: E0320 10:38:14.578192 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-apiservice-cert podName:10c5886c-4f03-45fb-b71a-a9213027a4fd nodeName:}" failed. No retries permitted until 2026-03-20 10:38:15.078163454 +0000 UTC m=+130.219709368 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-apiservice-cert") pod "packageserver-d55dfcdfc-zbshh" (UID: "10c5886c-4f03-45fb-b71a-a9213027a4fd") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.595923 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.616395 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.636559 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.641941 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.656186 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.659100 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.671803 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.676513 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.700081 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.715738 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.736128 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.756995 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.785538 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.798770 4748 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.815279 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.855454 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.876663 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.899071 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 10:38:14 crc kubenswrapper[4748]: W0320 10:38:14.918926 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-0646ebe80a1039991fa39171ad361d487467867eafc8e3f84d266e0f75319b5b WatchSource:0}: Error finding container 0646ebe80a1039991fa39171ad361d487467867eafc8e3f84d266e0f75319b5b: Status 404 returned error can't find the container with id 0646ebe80a1039991fa39171ad361d487467867eafc8e3f84d266e0f75319b5b Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.921364 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.934818 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 10:38:14 crc kubenswrapper[4748]: W0320 10:38:14.949542 4748 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-9b4d0bef054a3f3cdeea1b6e337b25bc1a974fb2175c52c4b0225daa4caa2de5 WatchSource:0}: Error finding container 9b4d0bef054a3f3cdeea1b6e337b25bc1a974fb2175c52c4b0225daa4caa2de5: Status 404 returned error can't find the container with id 9b4d0bef054a3f3cdeea1b6e337b25bc1a974fb2175c52c4b0225daa4caa2de5 Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.959397 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.975533 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 10:38:14 crc kubenswrapper[4748]: I0320 10:38:14.995263 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.000539 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dd2dd89-ba90-440f-abc8-74ab27d7db69-config-volume\") pod \"collect-profiles-29566710-vh2t7\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.000591 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-profile-collector-cert\") pod \"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.000623 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-serving-cert\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.000653 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dd2dd89-ba90-440f-abc8-74ab27d7db69-secret-volume\") pod \"collect-profiles-29566710-vh2t7\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.000733 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-srv-cert\") pod \"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.000760 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.000794 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-client-ca\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.000881 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-config\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.001511 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dd2dd89-ba90-440f-abc8-74ab27d7db69-config-volume\") pod \"collect-profiles-29566710-vh2t7\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.002132 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-config\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.002455 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-client-ca\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.002484 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:15 crc 
kubenswrapper[4748]: I0320 10:38:15.005017 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-serving-cert\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.005024 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-srv-cert\") pod \"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.005024 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dd2dd89-ba90-440f-abc8-74ab27d7db69-secret-volume\") pod \"collect-profiles-29566710-vh2t7\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.005120 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/51c43cdc-1f52-4850-9c2d-d317dc38c754-profile-collector-cert\") pod \"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.016185 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.034985 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 10:38:15 crc 
kubenswrapper[4748]: I0320 10:38:15.055665 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.075448 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.095203 4748 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.102401 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-apiservice-cert\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.102560 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-webhook-cert\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.106408 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-apiservice-cert\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.107963 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10c5886c-4f03-45fb-b71a-a9213027a4fd-webhook-cert\") 
pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.115188 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.134626 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.154967 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.176044 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.197375 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.215359 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.254375 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49x5z\" (UniqueName: \"kubernetes.io/projected/8a3c049c-4ed1-4b65-9ba9-854aa25764e5-kube-api-access-49x5z\") pod \"dns-operator-744455d44c-fftbt\" (UID: \"8a3c049c-4ed1-4b65-9ba9-854aa25764e5\") " pod="openshift-dns-operator/dns-operator-744455d44c-fftbt" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.283082 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8rk6\" (UniqueName: \"kubernetes.io/projected/a6110c56-5634-4ef9-92b1-4c7c75dd4986-kube-api-access-n8rk6\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-v79zw\" (UID: \"a6110c56-5634-4ef9-92b1-4c7c75dd4986\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.304039 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kmfs\" (UniqueName: \"kubernetes.io/projected/2cc53e20-383b-4e3a-a00a-d54ac8272e00-kube-api-access-7kmfs\") pod \"machine-api-operator-5694c8668f-9lbsk\" (UID: \"2cc53e20-383b-4e3a-a00a-d54ac8272e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.326827 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjt6c\" (UniqueName: \"kubernetes.io/projected/c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c-kube-api-access-xjt6c\") pod \"openshift-controller-manager-operator-756b6f6bc6-tqb9k\" (UID: \"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.331421 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e1f22c599fa152725b7e9ecf25e457175bd3aa1358f670422795141b7efa94a8"} Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.331485 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9b4d0bef054a3f3cdeea1b6e337b25bc1a974fb2175c52c4b0225daa4caa2de5"} Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.333704 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7bb0fb386b9c84fab7ac08dd3a61a7e8c9d506671738292620ef96d8fb8627a2"} Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.336541 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"aa3c321ac7029844c796d4c11d472a9894d91ccaee45a53facd398698651fb1e"} Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.336601 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0646ebe80a1039991fa39171ad361d487467867eafc8e3f84d266e0f75319b5b"} Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.336875 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.342997 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e36c673-8d30-4267-a115-ec3b62ad093a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.353481 4748 request.go:700] Waited for 1.888649554s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.363317 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp5qr\" (UniqueName: 
\"kubernetes.io/projected/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-kube-api-access-hp5qr\") pod \"controller-manager-879f6c89f-hg8kp\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.370007 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7bfg\" (UniqueName: \"kubernetes.io/projected/6c6c249f-695d-4875-94ad-a608e8bd7d5f-kube-api-access-x7bfg\") pod \"cluster-samples-operator-665b6dd947-26svx\" (UID: \"6c6c249f-695d-4875-94ad-a608e8bd7d5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.392790 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.396003 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29kvb\" (UniqueName: \"kubernetes.io/projected/122fe6aa-0852-4ddd-a678-6ce541c38250-kube-api-access-29kvb\") pod \"console-operator-58897d9998-9x2kc\" (UID: \"122fe6aa-0852-4ddd-a678-6ce541c38250\") " pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.405599 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.406348 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.415043 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.418773 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5ssh\" (UniqueName: \"kubernetes.io/projected/2c8e6760-4ff7-4417-917e-61bdc115d710-kube-api-access-x5ssh\") pod \"router-default-5444994796-rbx59\" (UID: \"2c8e6760-4ff7-4417-917e-61bdc115d710\") " pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.441020 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f7tz\" (UniqueName: \"kubernetes.io/projected/e24b6492-332b-421d-b2cd-ab1f11e06432-kube-api-access-6f7tz\") pod \"apiserver-7bbb656c7d-2xltn\" (UID: \"e24b6492-332b-421d-b2cd-ab1f11e06432\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.447993 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fftbt" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.458714 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wlws\" (UniqueName: \"kubernetes.io/projected/ff567791-a234-47f5-8350-154f93477bb9-kube-api-access-4wlws\") pod \"apiserver-76f77b778f-qq9xx\" (UID: \"ff567791-a234-47f5-8350-154f93477bb9\") " pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.479013 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlb6s\" (UniqueName: \"kubernetes.io/projected/0e36c673-8d30-4267-a115-ec3b62ad093a-kube-api-access-zlb6s\") pod \"cluster-image-registry-operator-dc59b4c8b-tvhtn\" (UID: \"0e36c673-8d30-4267-a115-ec3b62ad093a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.480095 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.486382 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.494424 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.501101 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f21328d-966f-4d28-b80a-c950c1c83b6e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6q2px\" (UID: \"0f21328d-966f-4d28-b80a-c950c1c83b6e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.527445 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsn8k\" (UniqueName: \"kubernetes.io/projected/554afb3a-2388-4d1f-8949-7c9c9941d685-kube-api-access-zsn8k\") pod \"machine-approver-56656f9798-9qxnc\" (UID: \"554afb3a-2388-4d1f-8949-7c9c9941d685\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.553232 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.553657 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbbgb\" (UniqueName: \"kubernetes.io/projected/3ec3ee66-08f2-44d5-a414-90354f5a4a9e-kube-api-access-nbbgb\") pod \"openshift-config-operator-7777fb866f-4k4dq\" (UID: \"3ec3ee66-08f2-44d5-a414-90354f5a4a9e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.558578 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.573442 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxwbs\" (UniqueName: \"kubernetes.io/projected/2dd2dd89-ba90-440f-abc8-74ab27d7db69-kube-api-access-sxwbs\") pod \"collect-profiles-29566710-vh2t7\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.579481 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wng4v\" (UniqueName: \"kubernetes.io/projected/61338f55-8e52-48cb-962d-40aefc460991-kube-api-access-wng4v\") pod \"etcd-operator-b45778765-8ttxd\" (UID: \"61338f55-8e52-48cb-962d-40aefc460991\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.593761 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw"] Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.612395 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb3bf250-9fda-4426-baa3-48eed453f90d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7zqwq\" (UID: \"eb3bf250-9fda-4426-baa3-48eed453f90d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.631676 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmk9t\" (UniqueName: \"kubernetes.io/projected/51c43cdc-1f52-4850-9c2d-d317dc38c754-kube-api-access-gmk9t\") pod \"catalog-operator-68c6474976-hsr94\" (UID: \"51c43cdc-1f52-4850-9c2d-d317dc38c754\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.632366 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.637050 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.638721 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8ch9\" (UniqueName: \"kubernetes.io/projected/0a5bf285-90ef-47c5-a959-7bae63c410a5-kube-api-access-m8ch9\") pod \"downloads-7954f5f757-hxtfq\" (UID: \"0a5bf285-90ef-47c5-a959-7bae63c410a5\") " pod="openshift-console/downloads-7954f5f757-hxtfq" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.650979 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9zp\" (UniqueName: \"kubernetes.io/projected/64222460-9199-40c1-bf14-07be02855e39-kube-api-access-5q9zp\") pod \"service-ca-9c57cc56f-4svjh\" (UID: \"64222460-9199-40c1-bf14-07be02855e39\") " pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.655527 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.656361 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" Mar 20 10:38:15 crc kubenswrapper[4748]: W0320 10:38:15.661547 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6110c56_5634_4ef9_92b1_4c7c75dd4986.slice/crio-31523ee70ad45009907db5dbeb5f614cd99f3bc0b886ab94857ab315dd8a1fcd WatchSource:0}: Error finding container 31523ee70ad45009907db5dbeb5f614cd99f3bc0b886ab94857ab315dd8a1fcd: Status 404 returned error can't find the container with id 31523ee70ad45009907db5dbeb5f614cd99f3bc0b886ab94857ab315dd8a1fcd Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.672023 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.676428 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.681961 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.695743 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.720856 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.725045 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.727718 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx"] Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.767604 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.772778 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb2j6\" (UniqueName: \"kubernetes.io/projected/10c5886c-4f03-45fb-b71a-a9213027a4fd-kube-api-access-cb2j6\") pod \"packageserver-d55dfcdfc-zbshh\" (UID: \"10c5886c-4f03-45fb-b71a-a9213027a4fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.790801 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27knb\" (UniqueName: \"kubernetes.io/projected/a32808c8-cba3-478e-bcf0-01434602a895-kube-api-access-27knb\") pod \"machine-config-operator-74547568cd-n6wkh\" (UID: \"a32808c8-cba3-478e-bcf0-01434602a895\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.813845 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.817200 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9cpv\" (UniqueName: \"kubernetes.io/projected/9d26944d-4b2d-4411-aa57-e74df0293000-kube-api-access-z9cpv\") pod \"kube-storage-version-migrator-operator-b67b599dd-p46hq\" (UID: \"9d26944d-4b2d-4411-aa57-e74df0293000\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.827605 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.836110 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng4rk\" (UniqueName: \"kubernetes.io/projected/9abcdd14-d386-4279-9d4c-4a7326a32a11-kube-api-access-ng4rk\") pod \"marketplace-operator-79b997595-xpncn\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.900231 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920587 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920641 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-registry-certificates\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920663 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zhx9\" (UniqueName: \"kubernetes.io/projected/0481ff64-8f10-4e72-a81e-d2c53278246a-kube-api-access-5zhx9\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920684 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a3e2353-3374-48ad-968b-8f4b487d63ed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-phlbm\" (UID: \"6a3e2353-3374-48ad-968b-8f4b487d63ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm" Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 
10:38:15.920718 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc943725-6ebd-4214-92c4-6f4d434ec186-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920751 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0481ff64-8f10-4e72-a81e-d2c53278246a-trusted-ca\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920780 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqndt\" (UniqueName: \"kubernetes.io/projected/ae334fdf-f952-4b6b-8372-1fd7ef332362-kube-api-access-wqndt\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920798 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920818 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc943725-6ebd-4214-92c4-6f4d434ec186-config\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920858 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9e528c-68f5-4e04-9ceb-99e84335173c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m7vd5\" (UID: \"ef9e528c-68f5-4e04-9ceb-99e84335173c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920880 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54kwz\" (UniqueName: \"kubernetes.io/projected/6a3e2353-3374-48ad-968b-8f4b487d63ed-kube-api-access-54kwz\") pod \"machine-config-controller-84d6567774-phlbm\" (UID: \"6a3e2353-3374-48ad-968b-8f4b487d63ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920916 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920938 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d563e0-088d-4fd3-a3cf-dc70e4a40d99-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zwht2\" (UID: \"96d563e0-088d-4fd3-a3cf-dc70e4a40d99\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920958 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-kube-api-access-tbbln\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.920981 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef9e528c-68f5-4e04-9ceb-99e84335173c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m7vd5\" (UID: \"ef9e528c-68f5-4e04-9ceb-99e84335173c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921001 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a3e2353-3374-48ad-968b-8f4b487d63ed-proxy-tls\") pod \"machine-config-controller-84d6567774-phlbm\" (UID: \"6a3e2353-3374-48ad-968b-8f4b487d63ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921045 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-oauth-config\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921069 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bqkp\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-kube-api-access-7bqkp\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921091 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65bf442d-adbb-4d6d-b0ee-55bbba31e305-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7cnpj\" (UID: \"65bf442d-adbb-4d6d-b0ee-55bbba31e305\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921114 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921134 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7da7288f-ca47-4172-a3dd-80a79e803277-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921167 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921202 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-trusted-ca\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921226 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-bound-sa-token\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921271 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh8zn\" (UniqueName: \"kubernetes.io/projected/96d563e0-088d-4fd3-a3cf-dc70e4a40d99-kube-api-access-lh8zn\") pod \"openshift-apiserver-operator-796bbdcf4f-zwht2\" (UID: \"96d563e0-088d-4fd3-a3cf-dc70e4a40d99\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921294 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921315 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921420 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br2dv\" (UniqueName: \"kubernetes.io/projected/fc943725-6ebd-4214-92c4-6f4d434ec186-kube-api-access-br2dv\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921444 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d563e0-088d-4fd3-a3cf-dc70e4a40d99-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zwht2\" (UID: \"96d563e0-088d-4fd3-a3cf-dc70e4a40d99\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921531 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-registry-tls\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921563 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4hzl\" (UniqueName: \"kubernetes.io/projected/65bf442d-adbb-4d6d-b0ee-55bbba31e305-kube-api-access-j4hzl\") pod \"multus-admission-controller-857f4d67dd-7cnpj\" (UID: \"65bf442d-adbb-4d6d-b0ee-55bbba31e305\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921585 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-dir\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921606 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921631 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921665 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921690 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt8rk\" (UniqueName: \"kubernetes.io/projected/b59f4996-13e0-4b70-b835-fbeea765c681-kube-api-access-zt8rk\") pod \"migrator-59844c95c7-brfsp\" (UID: \"b59f4996-13e0-4b70-b835-fbeea765c681\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921710 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-policies\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921731 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-service-ca\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921753 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfjwf\" (UniqueName: \"kubernetes.io/projected/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-kube-api-access-gfjwf\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921802 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-serving-cert\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921823 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-config\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921896 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-serving-cert\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921920 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9e528c-68f5-4e04-9ceb-99e84335173c-config\") pod \"kube-controller-manager-operator-78b949d7b-m7vd5\" (UID: \"ef9e528c-68f5-4e04-9ceb-99e84335173c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921948 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc943725-6ebd-4214-92c4-6f4d434ec186-service-ca-bundle\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.921970 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0481ff64-8f10-4e72-a81e-d2c53278246a-metrics-tls\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.922034 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0481ff64-8f10-4e72-a81e-d2c53278246a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.922063 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.922087 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-config\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.922107 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-trusted-ca-bundle\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.922128 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.922165 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc943725-6ebd-4214-92c4-6f4d434ec186-serving-cert\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.922187 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7da7288f-ca47-4172-a3dd-80a79e803277-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.922223 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-oauth-serving-cert\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.922252 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-client-ca\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"
Mar 20 10:38:15 crc kubenswrapper[4748]: E0320 10:38:15.932218 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:16.432199304 +0000 UTC m=+131.573745118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.937509 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hxtfq"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.950475 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn"
Mar 20 10:38:15 crc kubenswrapper[4748]: I0320 10:38:15.970103 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.005342 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9x2kc"]
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.007860 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hg8kp"]
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.015505 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.019104 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fftbt"]
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.023400 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.023577 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:16.52352384 +0000 UTC m=+131.665069654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.023618 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zhx9\" (UniqueName: \"kubernetes.io/projected/0481ff64-8f10-4e72-a81e-d2c53278246a-kube-api-access-5zhx9\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.023776 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a3e2353-3374-48ad-968b-8f4b487d63ed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-phlbm\" (UID: \"6a3e2353-3374-48ad-968b-8f4b487d63ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.023848 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc943725-6ebd-4214-92c4-6f4d434ec186-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.023880 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0481ff64-8f10-4e72-a81e-d2c53278246a-trusted-ca\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.023947 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqndt\" (UniqueName: \"kubernetes.io/projected/ae334fdf-f952-4b6b-8372-1fd7ef332362-kube-api-access-wqndt\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.023980 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.024004 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc943725-6ebd-4214-92c4-6f4d434ec186-config\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.024024 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9e528c-68f5-4e04-9ceb-99e84335173c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m7vd5\" (UID: \"ef9e528c-68f5-4e04-9ceb-99e84335173c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.024678 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a3e2353-3374-48ad-968b-8f4b487d63ed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-phlbm\" (UID: \"6a3e2353-3374-48ad-968b-8f4b487d63ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.025534 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.025572 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0481ff64-8f10-4e72-a81e-d2c53278246a-trusted-ca\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.025753 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54kwz\" (UniqueName: \"kubernetes.io/projected/6a3e2353-3374-48ad-968b-8f4b487d63ed-kube-api-access-54kwz\") pod \"machine-config-controller-84d6567774-phlbm\" (UID: \"6a3e2353-3374-48ad-968b-8f4b487d63ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.025810 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.025865 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d563e0-088d-4fd3-a3cf-dc70e4a40d99-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zwht2\" (UID: \"96d563e0-088d-4fd3-a3cf-dc70e4a40d99\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.025888 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-kube-api-access-tbbln\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.025918 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe-serving-cert\") pod \"service-ca-operator-777779d784-rcw2w\" (UID: \"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.025959 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef9e528c-68f5-4e04-9ceb-99e84335173c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m7vd5\" (UID: \"ef9e528c-68f5-4e04-9ceb-99e84335173c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.025989 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a3e2353-3374-48ad-968b-8f4b487d63ed-proxy-tls\") pod \"machine-config-controller-84d6567774-phlbm\" (UID: \"6a3e2353-3374-48ad-968b-8f4b487d63ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.026011 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55-metrics-tls\") pod \"dns-default-vn8zj\" (UID: \"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55\") " pod="openshift-dns/dns-default-vn8zj"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.026056 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-oauth-config\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.026080 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bqkp\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-kube-api-access-7bqkp\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.026101 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c0116c88-07ce-460e-90ba-2b5d8fd6a921-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.026124 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa0ff09-5605-496f-806a-a8719dd3d958-cert\") pod \"ingress-canary-dwfwm\" (UID: \"afa0ff09-5605-496f-806a-a8719dd3d958\") " pod="openshift-ingress-canary/ingress-canary-dwfwm"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.026146 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65bf442d-adbb-4d6d-b0ee-55bbba31e305-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7cnpj\" (UID: \"65bf442d-adbb-4d6d-b0ee-55bbba31e305\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.026174 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.028284 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d563e0-088d-4fd3-a3cf-dc70e4a40d99-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zwht2\" (UID: \"96d563e0-088d-4fd3-a3cf-dc70e4a40d99\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.029386 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.029882 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:16.529855074 +0000 UTC m=+131.671400978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.030186 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc943725-6ebd-4214-92c4-6f4d434ec186-config\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.030744 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc943725-6ebd-4214-92c4-6f4d434ec186-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"
Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.030908 4748
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7da7288f-ca47-4172-a3dd-80a79e803277-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.030950 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b34a7162-6088-486a-a1fc-6e01d4bd7ca6-srv-cert\") pod \"olm-operator-6b444d44fb-8xtg5\" (UID: \"b34a7162-6088-486a-a1fc-6e01d4bd7ca6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031125 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7kd\" (UniqueName: \"kubernetes.io/projected/c0116c88-07ce-460e-90ba-2b5d8fd6a921-kube-api-access-fb7kd\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031187 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-trusted-ca\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031238 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c0116c88-07ce-460e-90ba-2b5d8fd6a921-ready\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031272 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-bound-sa-token\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031304 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh8zn\" (UniqueName: \"kubernetes.io/projected/96d563e0-088d-4fd3-a3cf-dc70e4a40d99-kube-api-access-lh8zn\") pod \"openshift-apiserver-operator-796bbdcf4f-zwht2\" (UID: \"96d563e0-088d-4fd3-a3cf-dc70e4a40d99\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031325 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031362 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031447 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrb8l\" 
(UniqueName: \"kubernetes.io/projected/b34a7162-6088-486a-a1fc-6e01d4bd7ca6-kube-api-access-wrb8l\") pod \"olm-operator-6b444d44fb-8xtg5\" (UID: \"b34a7162-6088-486a-a1fc-6e01d4bd7ca6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031469 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55-config-volume\") pod \"dns-default-vn8zj\" (UID: \"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55\") " pod="openshift-dns/dns-default-vn8zj" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031488 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-plugins-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.031520 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-mountpoint-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.038252 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/65bf442d-adbb-4d6d-b0ee-55bbba31e305-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7cnpj\" (UID: \"65bf442d-adbb-4d6d-b0ee-55bbba31e305\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.035708 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a71010c-36bf-4830-91be-1f54483db83f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hn5gq\" (UID: \"0a71010c-36bf-4830-91be-1f54483db83f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.039359 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4wxw\" (UniqueName: \"kubernetes.io/projected/f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55-kube-api-access-s4wxw\") pod \"dns-default-vn8zj\" (UID: \"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55\") " pod="openshift-dns/dns-default-vn8zj" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.039400 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-registration-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.039496 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4jjs\" (UniqueName: \"kubernetes.io/projected/afa0ff09-5605-496f-806a-a8719dd3d958-kube-api-access-x4jjs\") pod \"ingress-canary-dwfwm\" (UID: \"afa0ff09-5605-496f-806a-a8719dd3d958\") " pod="openshift-ingress-canary/ingress-canary-dwfwm" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.039667 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br2dv\" (UniqueName: \"kubernetes.io/projected/fc943725-6ebd-4214-92c4-6f4d434ec186-kube-api-access-br2dv\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: 
\"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.039717 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d563e0-088d-4fd3-a3cf-dc70e4a40d99-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zwht2\" (UID: \"96d563e0-088d-4fd3-a3cf-dc70e4a40d99\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.039808 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.040026 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-registry-tls\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.040967 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-trusted-ca\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.036659 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ef9e528c-68f5-4e04-9ceb-99e84335173c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m7vd5\" (UID: \"ef9e528c-68f5-4e04-9ceb-99e84335173c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.042118 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.049171 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.050337 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-oauth-config\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.050378 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4hzl\" (UniqueName: \"kubernetes.io/projected/65bf442d-adbb-4d6d-b0ee-55bbba31e305-kube-api-access-j4hzl\") pod \"multus-admission-controller-857f4d67dd-7cnpj\" (UID: \"65bf442d-adbb-4d6d-b0ee-55bbba31e305\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj" Mar 
20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.050433 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-dir\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.050467 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.050599 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.050705 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b34a7162-6088-486a-a1fc-6e01d4bd7ca6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xtg5\" (UID: \"b34a7162-6088-486a-a1fc-6e01d4bd7ca6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.050770 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt8rk\" (UniqueName: \"kubernetes.io/projected/b59f4996-13e0-4b70-b835-fbeea765c681-kube-api-access-zt8rk\") 
pod \"migrator-59844c95c7-brfsp\" (UID: \"b59f4996-13e0-4b70-b835-fbeea765c681\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.050805 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-policies\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.052825 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-dir\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.053127 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.053265 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb9sq\" (UniqueName: \"kubernetes.io/projected/e5e57a00-19c2-4a17-9db0-ae67ea9fc70e-kube-api-access-gb9sq\") pod \"machine-config-server-p52m7\" (UID: \"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e\") " pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.053465 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-service-ca\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.053475 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.053896 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfjwf\" (UniqueName: \"kubernetes.io/projected/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-kube-api-access-gfjwf\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.054091 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-csi-data-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.054252 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe-config\") pod \"service-ca-operator-777779d784-rcw2w\" (UID: \"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 
10:38:16.054496 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-socket-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.054734 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-serving-cert\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.054782 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-config\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.054809 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-serving-cert\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.054905 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9e528c-68f5-4e04-9ceb-99e84335173c-config\") pod \"kube-controller-manager-operator-78b949d7b-m7vd5\" (UID: \"ef9e528c-68f5-4e04-9ceb-99e84335173c\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.055016 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a3e2353-3374-48ad-968b-8f4b487d63ed-proxy-tls\") pod \"machine-config-controller-84d6567774-phlbm\" (UID: \"6a3e2353-3374-48ad-968b-8f4b487d63ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.055540 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-policies\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.055544 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.055787 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.055861 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-service-ca\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.056089 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc943725-6ebd-4214-92c4-6f4d434ec186-service-ca-bundle\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.056258 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0481ff64-8f10-4e72-a81e-d2c53278246a-metrics-tls\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.056296 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9e528c-68f5-4e04-9ceb-99e84335173c-config\") pod \"kube-controller-manager-operator-78b949d7b-m7vd5\" (UID: \"ef9e528c-68f5-4e04-9ceb-99e84335173c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.056530 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0481ff64-8f10-4e72-a81e-d2c53278246a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.056573 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.056650 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5srzd\" (UniqueName: \"kubernetes.io/projected/3a7350f6-60df-4463-a08c-1b231d9443c6-kube-api-access-5srzd\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.056685 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.057084 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-config\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.057237 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-trusted-ca-bundle\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " 
pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.057272 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.057447 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5e57a00-19c2-4a17-9db0-ae67ea9fc70e-node-bootstrap-token\") pod \"machine-config-server-p52m7\" (UID: \"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e\") " pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.057676 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-config\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.058060 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m49nh\" (UniqueName: \"kubernetes.io/projected/0a71010c-36bf-4830-91be-1f54483db83f-kube-api-access-m49nh\") pod \"package-server-manager-789f6589d5-hn5gq\" (UID: \"0a71010c-36bf-4830-91be-1f54483db83f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.058098 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0116c88-07ce-460e-90ba-2b5d8fd6a921-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.058143 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc943725-6ebd-4214-92c4-6f4d434ec186-serving-cert\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.058213 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7da7288f-ca47-4172-a3dd-80a79e803277-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.058259 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnzw\" (UniqueName: \"kubernetes.io/projected/7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe-kube-api-access-smnzw\") pod \"service-ca-operator-777779d784-rcw2w\" (UID: \"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.058538 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-oauth-serving-cert\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc 
kubenswrapper[4748]: I0320 10:38:16.058565 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-client-ca\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.058644 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.058691 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5e57a00-19c2-4a17-9db0-ae67ea9fc70e-certs\") pod \"machine-config-server-p52m7\" (UID: \"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e\") " pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.058727 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-registry-certificates\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.058744 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwpj\" (UniqueName: \"kubernetes.io/projected/0de9aa72-edab-4ae9-b2dd-e20ef6b83277-kube-api-access-wpwpj\") pod 
\"auto-csr-approver-29566718-7hmv9\" (UID: \"0de9aa72-edab-4ae9-b2dd-e20ef6b83277\") " pod="openshift-infra/auto-csr-approver-29566718-7hmv9" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.059555 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7da7288f-ca47-4172-a3dd-80a79e803277-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.059759 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-client-ca\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.060814 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-registry-certificates\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.060983 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc943725-6ebd-4214-92c4-6f4d434ec186-service-ca-bundle\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.060999 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-config\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.062313 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.063623 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-trusted-ca-bundle\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.063817 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.064258 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-registry-tls\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.064507 4748 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-oauth-serving-cert\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.065313 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7da7288f-ca47-4172-a3dd-80a79e803277-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.081268 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9lbsk"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.081720 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0481ff64-8f10-4e72-a81e-d2c53278246a-metrics-tls\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.081970 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-serving-cert\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.082124 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qq9xx"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.083359 4748 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d563e0-088d-4fd3-a3cf-dc70e4a40d99-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zwht2\" (UID: \"96d563e0-088d-4fd3-a3cf-dc70e4a40d99\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.083495 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-serving-cert\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.083500 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.083938 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc943725-6ebd-4214-92c4-6f4d434ec186-serving-cert\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.087436 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zhx9\" (UniqueName: \"kubernetes.io/projected/0481ff64-8f10-4e72-a81e-d2c53278246a-kube-api-access-5zhx9\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" Mar 20 10:38:16 crc 
kubenswrapper[4748]: I0320 10:38:16.093697 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqndt\" (UniqueName: \"kubernetes.io/projected/ae334fdf-f952-4b6b-8372-1fd7ef332362-kube-api-access-wqndt\") pod \"console-f9d7485db-dmmdh\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.133983 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54kwz\" (UniqueName: \"kubernetes.io/projected/6a3e2353-3374-48ad-968b-8f4b487d63ed-kube-api-access-54kwz\") pod \"machine-config-controller-84d6567774-phlbm\" (UID: \"6a3e2353-3374-48ad-968b-8f4b487d63ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160354 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160568 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe-serving-cert\") pod \"service-ca-operator-777779d784-rcw2w\" (UID: \"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160610 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55-metrics-tls\") pod \"dns-default-vn8zj\" (UID: \"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55\") " pod="openshift-dns/dns-default-vn8zj" Mar 
20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160645 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c0116c88-07ce-460e-90ba-2b5d8fd6a921-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160668 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa0ff09-5605-496f-806a-a8719dd3d958-cert\") pod \"ingress-canary-dwfwm\" (UID: \"afa0ff09-5605-496f-806a-a8719dd3d958\") " pod="openshift-ingress-canary/ingress-canary-dwfwm" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160706 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b34a7162-6088-486a-a1fc-6e01d4bd7ca6-srv-cert\") pod \"olm-operator-6b444d44fb-8xtg5\" (UID: \"b34a7162-6088-486a-a1fc-6e01d4bd7ca6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160731 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7kd\" (UniqueName: \"kubernetes.io/projected/c0116c88-07ce-460e-90ba-2b5d8fd6a921-kube-api-access-fb7kd\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160751 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c0116c88-07ce-460e-90ba-2b5d8fd6a921-ready\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc 
kubenswrapper[4748]: I0320 10:38:16.160796 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-plugins-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160819 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrb8l\" (UniqueName: \"kubernetes.io/projected/b34a7162-6088-486a-a1fc-6e01d4bd7ca6-kube-api-access-wrb8l\") pod \"olm-operator-6b444d44fb-8xtg5\" (UID: \"b34a7162-6088-486a-a1fc-6e01d4bd7ca6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160859 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55-config-volume\") pod \"dns-default-vn8zj\" (UID: \"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55\") " pod="openshift-dns/dns-default-vn8zj" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160882 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-mountpoint-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160904 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a71010c-36bf-4830-91be-1f54483db83f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hn5gq\" (UID: \"0a71010c-36bf-4830-91be-1f54483db83f\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160929 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4wxw\" (UniqueName: \"kubernetes.io/projected/f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55-kube-api-access-s4wxw\") pod \"dns-default-vn8zj\" (UID: \"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55\") " pod="openshift-dns/dns-default-vn8zj" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160956 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-registration-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.160991 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4jjs\" (UniqueName: \"kubernetes.io/projected/afa0ff09-5605-496f-806a-a8719dd3d958-kube-api-access-x4jjs\") pod \"ingress-canary-dwfwm\" (UID: \"afa0ff09-5605-496f-806a-a8719dd3d958\") " pod="openshift-ingress-canary/ingress-canary-dwfwm" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161038 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b34a7162-6088-486a-a1fc-6e01d4bd7ca6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xtg5\" (UID: \"b34a7162-6088-486a-a1fc-6e01d4bd7ca6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161073 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb9sq\" (UniqueName: \"kubernetes.io/projected/e5e57a00-19c2-4a17-9db0-ae67ea9fc70e-kube-api-access-gb9sq\") pod 
\"machine-config-server-p52m7\" (UID: \"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e\") " pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161105 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-csi-data-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161127 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe-config\") pod \"service-ca-operator-777779d784-rcw2w\" (UID: \"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161149 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-socket-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161186 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5srzd\" (UniqueName: \"kubernetes.io/projected/3a7350f6-60df-4463-a08c-1b231d9443c6-kube-api-access-5srzd\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161213 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/e5e57a00-19c2-4a17-9db0-ae67ea9fc70e-node-bootstrap-token\") pod \"machine-config-server-p52m7\" (UID: \"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e\") " pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161243 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m49nh\" (UniqueName: \"kubernetes.io/projected/0a71010c-36bf-4830-91be-1f54483db83f-kube-api-access-m49nh\") pod \"package-server-manager-789f6589d5-hn5gq\" (UID: \"0a71010c-36bf-4830-91be-1f54483db83f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161269 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0116c88-07ce-460e-90ba-2b5d8fd6a921-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161294 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smnzw\" (UniqueName: \"kubernetes.io/projected/7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe-kube-api-access-smnzw\") pod \"service-ca-operator-777779d784-rcw2w\" (UID: \"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161318 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5e57a00-19c2-4a17-9db0-ae67ea9fc70e-certs\") pod \"machine-config-server-p52m7\" (UID: \"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e\") " pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.161344 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwpj\" (UniqueName: \"kubernetes.io/projected/0de9aa72-edab-4ae9-b2dd-e20ef6b83277-kube-api-access-wpwpj\") pod \"auto-csr-approver-29566718-7hmv9\" (UID: \"0de9aa72-edab-4ae9-b2dd-e20ef6b83277\") " pod="openshift-infra/auto-csr-approver-29566718-7hmv9" Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.162277 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:16.662253577 +0000 UTC m=+131.803799391 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.162910 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-plugins-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.163046 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-kube-api-access-tbbln\") pod \"oauth-openshift-558db77b4-c5vw7\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 
10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.163131 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-registration-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.163167 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-mountpoint-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.163976 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-socket-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.164135 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0116c88-07ce-460e-90ba-2b5d8fd6a921-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.164291 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3a7350f6-60df-4463-a08c-1b231d9443c6-csi-data-dir\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.164708 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55-config-volume\") pod \"dns-default-vn8zj\" (UID: \"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55\") " pod="openshift-dns/dns-default-vn8zj" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.165287 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c0116c88-07ce-460e-90ba-2b5d8fd6a921-ready\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.165664 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c0116c88-07ce-460e-90ba-2b5d8fd6a921-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.166280 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe-config\") pod \"service-ca-operator-777779d784-rcw2w\" (UID: \"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.169361 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a71010c-36bf-4830-91be-1f54483db83f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hn5gq\" (UID: \"0a71010c-36bf-4830-91be-1f54483db83f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.172754 
4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b34a7162-6088-486a-a1fc-6e01d4bd7ca6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8xtg5\" (UID: \"b34a7162-6088-486a-a1fc-6e01d4bd7ca6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.177393 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5e57a00-19c2-4a17-9db0-ae67ea9fc70e-certs\") pod \"machine-config-server-p52m7\" (UID: \"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e\") " pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.178170 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.179090 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5e57a00-19c2-4a17-9db0-ae67ea9fc70e-node-bootstrap-token\") pod \"machine-config-server-p52m7\" (UID: \"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e\") " pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.179336 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bqkp\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-kube-api-access-7bqkp\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.181239 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55-metrics-tls\") pod \"dns-default-vn8zj\" (UID: \"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55\") " pod="openshift-dns/dns-default-vn8zj" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.183536 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef9e528c-68f5-4e04-9ceb-99e84335173c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m7vd5\" (UID: \"ef9e528c-68f5-4e04-9ceb-99e84335173c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.184189 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.186899 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b34a7162-6088-486a-a1fc-6e01d4bd7ca6-srv-cert\") pod \"olm-operator-6b444d44fb-8xtg5\" (UID: \"b34a7162-6088-486a-a1fc-6e01d4bd7ca6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.186909 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa0ff09-5605-496f-806a-a8719dd3d958-cert\") pod \"ingress-canary-dwfwm\" (UID: \"afa0ff09-5605-496f-806a-a8719dd3d958\") " pod="openshift-ingress-canary/ingress-canary-dwfwm" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.187160 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe-serving-cert\") pod \"service-ca-operator-777779d784-rcw2w\" (UID: \"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.188130 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.191394 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-bound-sa-token\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.214139 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh8zn\" (UniqueName: \"kubernetes.io/projected/96d563e0-088d-4fd3-a3cf-dc70e4a40d99-kube-api-access-lh8zn\") pod \"openshift-apiserver-operator-796bbdcf4f-zwht2\" (UID: \"96d563e0-088d-4fd3-a3cf-dc70e4a40d99\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.235796 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br2dv\" (UniqueName: \"kubernetes.io/projected/fc943725-6ebd-4214-92c4-6f4d434ec186-kube-api-access-br2dv\") pod \"authentication-operator-69f744f599-kc5xl\" (UID: \"fc943725-6ebd-4214-92c4-6f4d434ec186\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl" Mar 20 10:38:16 crc kubenswrapper[4748]: W0320 10:38:16.237150 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5864ca9_f8a0_40c0_ade0_8d4e6e0a350c.slice/crio-47c0f566fbda796ccd97f133a5c0d2ca137746920d91598aefa0604c6a300dff WatchSource:0}: Error finding container 47c0f566fbda796ccd97f133a5c0d2ca137746920d91598aefa0604c6a300dff: Status 404 returned error can't find the 
container with id 47c0f566fbda796ccd97f133a5c0d2ca137746920d91598aefa0604c6a300dff Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.243947 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.262320 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.262701 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:16.762684094 +0000 UTC m=+131.904229908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.273894 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4hzl\" (UniqueName: \"kubernetes.io/projected/65bf442d-adbb-4d6d-b0ee-55bbba31e305-kube-api-access-j4hzl\") pod \"multus-admission-controller-857f4d67dd-7cnpj\" (UID: \"65bf442d-adbb-4d6d-b0ee-55bbba31e305\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.279711 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.289514 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt8rk\" (UniqueName: \"kubernetes.io/projected/b59f4996-13e0-4b70-b835-fbeea765c681-kube-api-access-zt8rk\") pod \"migrator-59844c95c7-brfsp\" (UID: \"b59f4996-13e0-4b70-b835-fbeea765c681\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp" Mar 20 10:38:16 crc kubenswrapper[4748]: W0320 10:38:16.289657 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24b6492_332b_421d_b2cd_ab1f11e06432.slice/crio-d2efe3765418a6209069249c270fcdb643dc520170bea396a8e21a0e3784854f WatchSource:0}: Error finding container d2efe3765418a6209069249c270fcdb643dc520170bea396a8e21a0e3784854f: Status 404 returned error can't find the container with id d2efe3765418a6209069249c270fcdb643dc520170bea396a8e21a0e3784854f Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.302129 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.311490 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfjwf\" (UniqueName: \"kubernetes.io/projected/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-kube-api-access-gfjwf\") pod \"route-controller-manager-6576b87f9c-xwxlw\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.332736 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0481ff64-8f10-4e72-a81e-d2c53278246a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sksdb\" (UID: \"0481ff64-8f10-4e72-a81e-d2c53278246a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.341858 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.350876 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpwpj\" (UniqueName: \"kubernetes.io/projected/0de9aa72-edab-4ae9-b2dd-e20ef6b83277-kube-api-access-wpwpj\") pod \"auto-csr-approver-29566718-7hmv9\" (UID: \"0de9aa72-edab-4ae9-b2dd-e20ef6b83277\") " pod="openshift-infra/auto-csr-approver-29566718-7hmv9" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.357926 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" event={"ID":"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c","Type":"ContainerStarted","Data":"47c0f566fbda796ccd97f133a5c0d2ca137746920d91598aefa0604c6a300dff"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.362006 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" event={"ID":"6c6c249f-695d-4875-94ad-a608e8bd7d5f","Type":"ContainerStarted","Data":"12a2bbf0979b12d10c351de55ef1c76b2607e5a8e2db34c4858cb22fa8510416"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.362040 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" event={"ID":"6c6c249f-695d-4875-94ad-a608e8bd7d5f","Type":"ContainerStarted","Data":"df747a1c5df6f56303e002670ac8fe59e723ae566f3a459f1a03a7a66816b134"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.362897 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 
10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.363168 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:16.863139182 +0000 UTC m=+132.004684986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.363359 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.363841 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:16.863815339 +0000 UTC m=+132.005361153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.364532 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"667e9c73fa33e6a602024142054e9bdbe878857ac11b86203982b8e7d76ae100"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.367242 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" event={"ID":"554afb3a-2388-4d1f-8949-7c9c9941d685","Type":"ContainerStarted","Data":"a25d4cbccde56765205bc7bf07ef5ad839d032e1d5eef583f8305444165864aa"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.367281 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" event={"ID":"554afb3a-2388-4d1f-8949-7c9c9941d685","Type":"ContainerStarted","Data":"a9611e9b3c9508dfd5af594db20f17c0ada525f8617a2573876309ff217ee036"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.373800 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrb8l\" (UniqueName: \"kubernetes.io/projected/b34a7162-6088-486a-a1fc-6e01d4bd7ca6-kube-api-access-wrb8l\") pod \"olm-operator-6b444d44fb-8xtg5\" (UID: \"b34a7162-6088-486a-a1fc-6e01d4bd7ca6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.380167 4748 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" event={"ID":"e24b6492-332b-421d-b2cd-ab1f11e06432","Type":"ContainerStarted","Data":"d2efe3765418a6209069249c270fcdb643dc520170bea396a8e21a0e3784854f"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.381205 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.382556 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.382788 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566718-7hmv9" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.383459 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4svjh"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.387928 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" event={"ID":"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa","Type":"ContainerStarted","Data":"d72e87d8b8c59f8bfec7e04485646c905ac0293152e1c4035189637f966e6004"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.392497 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5srzd\" (UniqueName: \"kubernetes.io/projected/3a7350f6-60df-4463-a08c-1b231d9443c6-kube-api-access-5srzd\") pod \"csi-hostpathplugin-cfzr7\" (UID: \"3a7350f6-60df-4463-a08c-1b231d9443c6\") " pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.393461 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.393746 
4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.393757 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rbx59" event={"ID":"2c8e6760-4ff7-4417-917e-61bdc115d710","Type":"ContainerStarted","Data":"e54d4467bbd697fd8a649ee20bf1b25c1bb641fc85a935737721e42f32484aab"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.393781 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rbx59" event={"ID":"2c8e6760-4ff7-4417-917e-61bdc115d710","Type":"ContainerStarted","Data":"21850889096ef16694e2d2d8b52010f32d94830371d418f073c860cf270a8761"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.396884 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" event={"ID":"2cc53e20-383b-4e3a-a00a-d54ac8272e00","Type":"ContainerStarted","Data":"b41a528b9cd6585bf8110a5833024b6ccd6709fde535ca359821673497cdde22"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.403723 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.410052 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4jjs\" (UniqueName: \"kubernetes.io/projected/afa0ff09-5605-496f-806a-a8719dd3d958-kube-api-access-x4jjs\") pod \"ingress-canary-dwfwm\" (UID: \"afa0ff09-5605-496f-806a-a8719dd3d958\") " pod="openshift-ingress-canary/ingress-canary-dwfwm" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.413079 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fftbt" 
event={"ID":"8a3c049c-4ed1-4b65-9ba9-854aa25764e5","Type":"ContainerStarted","Data":"3061704a8f348fabc7f0cd6cf46a292a52518f0e1a9b70f22a89a0305fc460e5"} Mar 20 10:38:16 crc kubenswrapper[4748]: W0320 10:38:16.416692 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e36c673_8d30_4267_a115_ec3b62ad093a.slice/crio-11f9fc7d6bd156b41d5f3d8f229b2b99f70423fe0bdb42c06d2d9412aa234bca WatchSource:0}: Error finding container 11f9fc7d6bd156b41d5f3d8f229b2b99f70423fe0bdb42c06d2d9412aa234bca: Status 404 returned error can't find the container with id 11f9fc7d6bd156b41d5f3d8f229b2b99f70423fe0bdb42c06d2d9412aa234bca Mar 20 10:38:16 crc kubenswrapper[4748]: W0320 10:38:16.421822 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d26944d_4b2d_4411_aa57_e74df0293000.slice/crio-010fa5f14b405de07f926da94634db6052da6a201de201ed54dd9b6a2a9d0c17 WatchSource:0}: Error finding container 010fa5f14b405de07f926da94634db6052da6a201de201ed54dd9b6a2a9d0c17: Status 404 returned error can't find the container with id 010fa5f14b405de07f926da94634db6052da6a201de201ed54dd9b6a2a9d0c17 Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.426415 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dwfwm" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.427518 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9x2kc" event={"ID":"122fe6aa-0852-4ddd-a678-6ce541c38250","Type":"ContainerStarted","Data":"237c54b4462f9d1298c0a4dec9ecff99a6554d29552c3f4c9fde26e0bf22a1f2"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.429905 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb9sq\" (UniqueName: \"kubernetes.io/projected/e5e57a00-19c2-4a17-9db0-ae67ea9fc70e-kube-api-access-gb9sq\") pod \"machine-config-server-p52m7\" (UID: \"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e\") " pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.431769 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.435163 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" event={"ID":"0f21328d-966f-4d28-b80a-c950c1c83b6e","Type":"ContainerStarted","Data":"ed9ed683d576441a82ba0f7e2ab7465d781bce0655086dea1a4cfaa81dd26d95"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.437335 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" event={"ID":"ff567791-a234-47f5-8350-154f93477bb9","Type":"ContainerStarted","Data":"d7dabfa8f012f0228cdbc75666d20b794d50c2fbc6cc981d4d726236df98b500"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.438900 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.439314 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw" event={"ID":"a6110c56-5634-4ef9-92b1-4c7c75dd4986","Type":"ContainerStarted","Data":"9f690f83402d48cf3bedda84f255a95972912df761587225e7c025a806e54a3d"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.439357 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw" event={"ID":"a6110c56-5634-4ef9-92b1-4c7c75dd4986","Type":"ContainerStarted","Data":"31523ee70ad45009907db5dbeb5f614cd99f3bc0b886ab94857ab315dd8a1fcd"} Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.448599 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hxtfq"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.460248 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.461718 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m49nh\" (UniqueName: \"kubernetes.io/projected/0a71010c-36bf-4830-91be-1f54483db83f-kube-api-access-m49nh\") pod \"package-server-manager-789f6589d5-hn5gq\" (UID: \"0a71010c-36bf-4830-91be-1f54483db83f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.464510 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 
10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.464728 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:16.964694187 +0000 UTC m=+132.106240081 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.466271 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.466597 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:16.966585503 +0000 UTC m=+132.108131327 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.472426 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7kd\" (UniqueName: \"kubernetes.io/projected/c0116c88-07ce-460e-90ba-2b5d8fd6a921-kube-api-access-fb7kd\") pod \"cni-sysctl-allowlist-ds-9cpg7\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: W0320 10:38:16.487443 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c43cdc_1f52_4850_9c2d_d317dc38c754.slice/crio-5581c5fd0f3dc7a388852eb9d1173fcfd455896647f0a933e126b93dba1058ca WatchSource:0}: Error finding container 5581c5fd0f3dc7a388852eb9d1173fcfd455896647f0a933e126b93dba1058ca: Status 404 returned error can't find the container with id 5581c5fd0f3dc7a388852eb9d1173fcfd455896647f0a933e126b93dba1058ca Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.492610 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4wxw\" (UniqueName: \"kubernetes.io/projected/f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55-kube-api-access-s4wxw\") pod \"dns-default-vn8zj\" (UID: \"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55\") " pod="openshift-dns/dns-default-vn8zj" Mar 20 10:38:16 crc kubenswrapper[4748]: W0320 10:38:16.498091 4748 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ec3ee66_08f2_44d5_a414_90354f5a4a9e.slice/crio-eeafd1b2aa85b09858b0969c850d18ba4fca9bd63eb6071a060860eecd626ceb WatchSource:0}: Error finding container eeafd1b2aa85b09858b0969c850d18ba4fca9bd63eb6071a060860eecd626ceb: Status 404 returned error can't find the container with id eeafd1b2aa85b09858b0969c850d18ba4fca9bd63eb6071a060860eecd626ceb Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.513949 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.514307 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.515505 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnzw\" (UniqueName: \"kubernetes.io/projected/7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe-kube-api-access-smnzw\") pod \"service-ca-operator-777779d784-rcw2w\" (UID: \"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.527575 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.534350 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8ttxd"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.537931 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq"] Mar 20 10:38:16 crc kubenswrapper[4748]: W0320 10:38:16.542406 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c5886c_4f03_45fb_b71a_a9213027a4fd.slice/crio-e488d90461f07c778f3ae691650cdc76c90cc48f2b25e2748348b7f669599b73 WatchSource:0}: Error finding container e488d90461f07c778f3ae691650cdc76c90cc48f2b25e2748348b7f669599b73: Status 404 returned error can't find the container with id e488d90461f07c778f3ae691650cdc76c90cc48f2b25e2748348b7f669599b73 Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.554053 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.555519 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.555578 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.557250 4748 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.568200 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.569057 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:17.069031249 +0000 UTC m=+132.210577063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.584080 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.629754 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.630050 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.640972 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.641786 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.667709 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xpncn"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.670467 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.670801 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:17.170787968 +0000 UTC m=+132.312333782 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.689360 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vn8zj" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.704806 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.713902 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-p52m7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.721598 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.771606 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.772091 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:38:17.272074046 +0000 UTC m=+132.413619860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.824326 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.873360 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.873814 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:17.373795045 +0000 UTC m=+132.515340849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: W0320 10:38:16.912791 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96d563e0_088d_4fd3_a3cf_dc70e4a40d99.slice/crio-b07b09110f9533478a7939ef75fca3c4ba205d6507e5112a4e1a872a7daea4c5 WatchSource:0}: Error finding container b07b09110f9533478a7939ef75fca3c4ba205d6507e5112a4e1a872a7daea4c5: Status 404 returned error can't find the container with id b07b09110f9533478a7939ef75fca3c4ba205d6507e5112a4e1a872a7daea4c5 Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.955961 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"] Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.974123 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.974279 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:17.474250286 +0000 UTC m=+132.615796090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.974393 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:16 crc kubenswrapper[4748]: E0320 10:38:16.974755 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:17.474739219 +0000 UTC m=+132.616285033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:16 crc kubenswrapper[4748]: I0320 10:38:16.981716 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c5vw7"] Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.064971 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dmmdh"] Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.075799 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:17 crc kubenswrapper[4748]: E0320 10:38:17.076041 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:17.57598589 +0000 UTC m=+132.717531714 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.177045 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:17 crc kubenswrapper[4748]: E0320 10:38:17.177437 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:17.677423876 +0000 UTC m=+132.818969690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.277553 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:17 crc kubenswrapper[4748]: E0320 10:38:17.277959 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:17.777941707 +0000 UTC m=+132.919487521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.289015 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v79zw" podStartSLOduration=73.288998439 podStartE2EDuration="1m13.288998439s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:17.288752573 +0000 UTC m=+132.430298387" watchObservedRunningTime="2026-03-20 10:38:17.288998439 +0000 UTC m=+132.430544253" Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.379432 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:17 crc kubenswrapper[4748]: E0320 10:38:17.379754 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:17.879741053 +0000 UTC m=+133.021286867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.484081 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:17 crc kubenswrapper[4748]: E0320 10:38:17.484599 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:17.984583239 +0000 UTC m=+133.126129053 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.494619 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2" event={"ID":"96d563e0-088d-4fd3-a3cf-dc70e4a40d99","Type":"ContainerStarted","Data":"b07b09110f9533478a7939ef75fca3c4ba205d6507e5112a4e1a872a7daea4c5"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.498570 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-rbx59" podStartSLOduration=73.498550478 podStartE2EDuration="1m13.498550478s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:17.494606654 +0000 UTC m=+132.636152468" watchObservedRunningTime="2026-03-20 10:38:17.498550478 +0000 UTC m=+132.640096292" Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.505261 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" event={"ID":"a49c3549-30d5-4927-ae41-cc2b9e7f47e2","Type":"ContainerStarted","Data":"b291e9456845147eb9330180fa0fe46d069360b20e2f5ae1b5e6bdbb4864313a"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.509100 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fftbt" 
event={"ID":"8a3c049c-4ed1-4b65-9ba9-854aa25764e5","Type":"ContainerStarted","Data":"1a86afe86dbafe3ee6fa4639f6267c5e8fc1ac5766397676f65e99290e0c384e"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.511372 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" event={"ID":"2dd2dd89-ba90-440f-abc8-74ab27d7db69","Type":"ContainerStarted","Data":"6289653e02f4403943a0bee471c80acadad5ccb5e9266bcde2d99f64dea0c33d"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.544770 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" event={"ID":"3ec3ee66-08f2-44d5-a414-90354f5a4a9e","Type":"ContainerStarted","Data":"ee4c81d5e98498d39b6c7b8702d8e5d358c7af0b4051970f802b9005eb9845a9"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.544807 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" event={"ID":"3ec3ee66-08f2-44d5-a414-90354f5a4a9e","Type":"ContainerStarted","Data":"eeafd1b2aa85b09858b0969c850d18ba4fca9bd63eb6071a060860eecd626ceb"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.544822 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" event={"ID":"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28","Type":"ContainerStarted","Data":"6e5c6044d4ee71169aebf20b11677808229d917450e7803e53d2be232d9ddc4f"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.544845 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" event={"ID":"0f21328d-966f-4d28-b80a-c950c1c83b6e","Type":"ContainerStarted","Data":"1349985df0ca218cc0cac4e4d7ec409bd0a7eb877646a7e3d922d5d3f0be5aab"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.544856 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm" event={"ID":"6a3e2353-3374-48ad-968b-8f4b487d63ed","Type":"ContainerStarted","Data":"8ad4a7ba78f58a3b213bed065d27fac91f145eda98af363c8a182088d959b079"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.545074 4748 generic.go:334] "Generic (PLEG): container finished" podID="e24b6492-332b-421d-b2cd-ab1f11e06432" containerID="6f3dc3491806ff26050c7242abbe70eb9155b2ca82c58b1aec2d8927cbe48d41" exitCode=0 Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.545148 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" event={"ID":"e24b6492-332b-421d-b2cd-ab1f11e06432","Type":"ContainerDied","Data":"6f3dc3491806ff26050c7242abbe70eb9155b2ca82c58b1aec2d8927cbe48d41"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.558821 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" event={"ID":"2cc53e20-383b-4e3a-a00a-d54ac8272e00","Type":"ContainerStarted","Data":"a4aea92f873fec2c62be0e49cb81646378c0860e491e775e7427794184b8655f"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.563677 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" event={"ID":"a32808c8-cba3-478e-bcf0-01434602a895","Type":"ContainerStarted","Data":"c582e62c9a0ad3275d225765548c041e4e2fc4ed54ee341cfdacc14d1d03ceb7"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.571240 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" event={"ID":"61338f55-8e52-48cb-962d-40aefc460991","Type":"ContainerStarted","Data":"073c30f340884e42310950d9000af624cf4cf8d1da8780786f9e5fb14560b086"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.572228 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" event={"ID":"9abcdd14-d386-4279-9d4c-4a7326a32a11","Type":"ContainerStarted","Data":"3723617594ce7ac27a00b57c0746f02379306603b55346b8e86b2e1542c23d25"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.574800 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9x2kc" event={"ID":"122fe6aa-0852-4ddd-a678-6ce541c38250","Type":"ContainerStarted","Data":"b2d234601d1bf464003af80108c513ab8277d03b726a125b8247ed9dd7900938"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.576370 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" event={"ID":"64222460-9199-40c1-bf14-07be02855e39","Type":"ContainerStarted","Data":"5bf81d44d63d09e59e2d5275391695723752523d62af92ae0137641de40d3519"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.577802 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" event={"ID":"0e36c673-8d30-4267-a115-ec3b62ad093a","Type":"ContainerStarted","Data":"11f9fc7d6bd156b41d5f3d8f229b2b99f70423fe0bdb42c06d2d9412aa234bca"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.578456 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" event={"ID":"eb3bf250-9fda-4426-baa3-48eed453f90d","Type":"ContainerStarted","Data":"1219cd620d6a16d40204b0fde9c1deb7f999c610dd9678813a388b0789957df5"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.579626 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" event={"ID":"10c5886c-4f03-45fb-b71a-a9213027a4fd","Type":"ContainerStarted","Data":"e488d90461f07c778f3ae691650cdc76c90cc48f2b25e2748348b7f669599b73"} Mar 20 10:38:17 crc kubenswrapper[4748]: W0320 10:38:17.584391 4748 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0116c88_07ce_460e_90ba_2b5d8fd6a921.slice/crio-81e4b77d8b16210c5e522889b70c1a16f09db91b82a58af028394ed897c21715 WatchSource:0}: Error finding container 81e4b77d8b16210c5e522889b70c1a16f09db91b82a58af028394ed897c21715: Status 404 returned error can't find the container with id 81e4b77d8b16210c5e522889b70c1a16f09db91b82a58af028394ed897c21715 Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.586146 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.588882 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" event={"ID":"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa","Type":"ContainerStarted","Data":"e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.589556 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:17 crc kubenswrapper[4748]: E0320 10:38:17.590763 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:18.090749921 +0000 UTC m=+133.232295735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.597409 4748 generic.go:334] "Generic (PLEG): container finished" podID="ff567791-a234-47f5-8350-154f93477bb9" containerID="4431b6c9eb31157bb65035820870fb5a52ceb1a7944382ad92f64a010407e3ff" exitCode=0 Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.597506 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" event={"ID":"ff567791-a234-47f5-8350-154f93477bb9","Type":"ContainerDied","Data":"4431b6c9eb31157bb65035820870fb5a52ceb1a7944382ad92f64a010407e3ff"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.599620 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dmmdh" event={"ID":"ae334fdf-f952-4b6b-8372-1fd7ef332362","Type":"ContainerStarted","Data":"51f9ea693ca416aeffb6c23f1fb4c79c8b0c6bd47470b07515d782f1059f369a"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.626969 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:38:17 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld Mar 20 10:38:17 crc kubenswrapper[4748]: [+]process-running ok Mar 20 10:38:17 crc kubenswrapper[4748]: healthz check failed Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.627038 4748 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.630797 4748 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hg8kp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.630849 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" podUID="ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.631425 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" event={"ID":"9d26944d-4b2d-4411-aa57-e74df0293000","Type":"ContainerStarted","Data":"477637abca965ae770f5c2ef6f38b7edeadd987b3beef3fcc2d0261eb53cdd32"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.631479 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" event={"ID":"9d26944d-4b2d-4411-aa57-e74df0293000","Type":"ContainerStarted","Data":"010fa5f14b405de07f926da94634db6052da6a201de201ed54dd9b6a2a9d0c17"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.634603 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hxtfq" 
event={"ID":"0a5bf285-90ef-47c5-a959-7bae63c410a5","Type":"ContainerStarted","Data":"32fcbf402f4cf70bcf0b9fab4cd3dafb70fc26381ae2dc3db5de408c9e99b868"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.643729 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" event={"ID":"51c43cdc-1f52-4850-9c2d-d317dc38c754","Type":"ContainerStarted","Data":"fbdf387207b7255c81a3a65c67413da4ae067a3a247ce30942b8630035ae57f1"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.643782 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" event={"ID":"51c43cdc-1f52-4850-9c2d-d317dc38c754","Type":"ContainerStarted","Data":"5581c5fd0f3dc7a388852eb9d1173fcfd455896647f0a933e126b93dba1058ca"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.645111 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.648294 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" event={"ID":"c5864ca9-f8a0-40c0-ade0-8d4e6e0a350c","Type":"ContainerStarted","Data":"077707af3ef89b1d61c7162b33a0fdc937a9348d9d40b3a2c1b627d2eda110fe"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.656195 4748 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hsr94 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.656266 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" 
podUID="51c43cdc-1f52-4850-9c2d-d317dc38c754" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.668173 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" event={"ID":"6c6c249f-695d-4875-94ad-a608e8bd7d5f","Type":"ContainerStarted","Data":"177d90bcb246db55c8054ea52ae55165ba9a35a16b13deff008738f7e47abacb"} Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.687212 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:17 crc kubenswrapper[4748]: E0320 10:38:17.688510 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:18.188485539 +0000 UTC m=+133.330031363 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.790701 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:17 crc kubenswrapper[4748]: E0320 10:38:17.791333 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:18.291310352 +0000 UTC m=+133.432856166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.799553 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cfzr7"] Mar 20 10:38:17 crc kubenswrapper[4748]: W0320 10:38:17.846035 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a7350f6_60df_4463_a08c_1b231d9443c6.slice/crio-cf5e54029355cdfb1316e2336a55a4e8a743da6eb91115ec20518f6443a3ccfa WatchSource:0}: Error finding container cf5e54029355cdfb1316e2336a55a4e8a743da6eb91115ec20518f6443a3ccfa: Status 404 returned error can't find the container with id cf5e54029355cdfb1316e2336a55a4e8a743da6eb91115ec20518f6443a3ccfa Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.894484 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:17 crc kubenswrapper[4748]: E0320 10:38:17.894889 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:18.394867364 +0000 UTC m=+133.536413178 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.908076 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5"] Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.921235 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kc5xl"] Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.924746 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq"] Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.924776 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566718-7hmv9"] Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.930068 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dwfwm"] Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.938414 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vn8zj"] Mar 20 10:38:17 crc kubenswrapper[4748]: W0320 10:38:17.994648 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafa0ff09_5605_496f_806a_a8719dd3d958.slice/crio-36263be7d8efe75e5d70f567d6c283564d98af46e639f72fc451ab774bf0759c WatchSource:0}: Error finding container 36263be7d8efe75e5d70f567d6c283564d98af46e639f72fc451ab774bf0759c: Status 404 returned error can't find 
the container with id 36263be7d8efe75e5d70f567d6c283564d98af46e639f72fc451ab774bf0759c Mar 20 10:38:17 crc kubenswrapper[4748]: I0320 10:38:17.995715 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:17 crc kubenswrapper[4748]: E0320 10:38:17.996051 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:18.496039034 +0000 UTC m=+133.637584838 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: W0320 10:38:18.013042 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2320984_2f2c_4a56_a1d1_0c9b2e5f2d55.slice/crio-b8d663893a0f0cb277161ed1b90aeda62f667d7e1a7747a65447f09d3b14cc8d WatchSource:0}: Error finding container b8d663893a0f0cb277161ed1b90aeda62f667d7e1a7747a65447f09d3b14cc8d: Status 404 returned error can't find the container with id b8d663893a0f0cb277161ed1b90aeda62f667d7e1a7747a65447f09d3b14cc8d Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.023951 4748 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5"] Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.034026 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.037695 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp"] Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.042356 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb"] Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.059036 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7cnpj"] Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.098115 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.098541 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:18.598520158 +0000 UTC m=+133.740065972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.112683 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w"] Mar 20 10:38:18 crc kubenswrapper[4748]: W0320 10:38:18.135951 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e2e266d_1d26_4bbc_987e_e08b2fa0ccfe.slice/crio-e9c44412dde98d0cd2ea744588340bbacee85ce247ccae1c540fcd9ef4893fd3 WatchSource:0}: Error finding container e9c44412dde98d0cd2ea744588340bbacee85ce247ccae1c540fcd9ef4893fd3: Status 404 returned error can't find the container with id e9c44412dde98d0cd2ea744588340bbacee85ce247ccae1c540fcd9ef4893fd3 Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.201489 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.201930 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:18.701915346 +0000 UTC m=+133.843461160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.213864 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" podStartSLOduration=74.21382767 podStartE2EDuration="1m14.21382767s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:18.213497871 +0000 UTC m=+133.355043685" watchObservedRunningTime="2026-03-20 10:38:18.21382767 +0000 UTC m=+133.355373484" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.302789 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.303087 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:18.803056074 +0000 UTC m=+133.944601908 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.303156 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.303608 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:18.803594088 +0000 UTC m=+133.945139902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.376659 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p46hq" podStartSLOduration=74.376635206 podStartE2EDuration="1m14.376635206s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:18.375047744 +0000 UTC m=+133.516593558" watchObservedRunningTime="2026-03-20 10:38:18.376635206 +0000 UTC m=+133.518181020" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.406605 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.407059 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:18.907036568 +0000 UTC m=+134.048582382 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.495610 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" podStartSLOduration=74.495587734 podStartE2EDuration="1m14.495587734s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:18.491103596 +0000 UTC m=+133.632649410" watchObservedRunningTime="2026-03-20 10:38:18.495587734 +0000 UTC m=+133.637133548" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.508889 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.509350 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.009335467 +0000 UTC m=+134.150881281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.558731 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:38:18 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld Mar 20 10:38:18 crc kubenswrapper[4748]: [+]process-running ok Mar 20 10:38:18 crc kubenswrapper[4748]: healthz check failed Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.558860 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.607308 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-9x2kc" podStartSLOduration=74.607282621 podStartE2EDuration="1m14.607282621s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:18.606774058 +0000 UTC m=+133.748319882" watchObservedRunningTime="2026-03-20 10:38:18.607282621 +0000 UTC m=+133.748828435" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.609900 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.610301 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.11028501 +0000 UTC m=+134.251830824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.674388 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5" event={"ID":"ef9e528c-68f5-4e04-9ceb-99e84335173c","Type":"ContainerStarted","Data":"a677d6cca2865ff6f8e9f8e43b33ba9eb24c71a90fcb7e71e29a06eda2dc84f9"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.677070 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp" event={"ID":"b59f4996-13e0-4b70-b835-fbeea765c681","Type":"ContainerStarted","Data":"9a2a9a2375fe4d022ac7988a3d83ec18a79c4f9b7e70b42e51307056cfbfa15a"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.678544 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" event={"ID":"a32808c8-cba3-478e-bcf0-01434602a895","Type":"ContainerStarted","Data":"1b4a248bdd9343367afc6a3d4438539f50f2653dbf048edfc75f0f5e3e5471e5"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.680544 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" event={"ID":"9abcdd14-d386-4279-9d4c-4a7326a32a11","Type":"ContainerStarted","Data":"dc05cb3effe7316098cdb06421a4067203e2d5538b4be516ead6a95e449ace37"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.681910 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" event={"ID":"0481ff64-8f10-4e72-a81e-d2c53278246a","Type":"ContainerStarted","Data":"a1fe474e935336b793718b1e1d5f19b3521ab4f1ebb0569e03ec44ecfca5d29a"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.682976 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" event={"ID":"0a71010c-36bf-4830-91be-1f54483db83f","Type":"ContainerStarted","Data":"9ea02d62af8a3bd72c30faafa273c83c03db949d19d3eb1c3c6c312148f3ea25"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.684415 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" event={"ID":"61338f55-8e52-48cb-962d-40aefc460991","Type":"ContainerStarted","Data":"0d975708ab29e91dba65704f5fbb9d917a34fa2ca0cfb3ddec31ed1007f89619"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.691666 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" event={"ID":"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28","Type":"ContainerStarted","Data":"d5f20b96ea371a511f703240ed488aebd4732d30be3a9a7e46e36675c6839e85"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.693302 4748 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-p52m7" event={"ID":"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e","Type":"ContainerStarted","Data":"4e3648df8abce87c7fd46022678a8b958d5fd0de8903774159397b2ffdabd51a"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.693326 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-p52m7" event={"ID":"e5e57a00-19c2-4a17-9db0-ae67ea9fc70e","Type":"ContainerStarted","Data":"38bf8ff05e8b837a402bc825cc51145cf8be63aab002156d82289808f5dd204b"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.695564 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fftbt" event={"ID":"8a3c049c-4ed1-4b65-9ba9-854aa25764e5","Type":"ContainerStarted","Data":"28e54f6a5418d2184e694db8c162d73bef23a5646e1cc74cec04abee5e507694"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.696994 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" event={"ID":"554afb3a-2388-4d1f-8949-7c9c9941d685","Type":"ContainerStarted","Data":"5ebf6a74c96afe5915220bd44db0a85dc44d0cee7bed392cd1b8d12ba3031db1"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.701619 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" event={"ID":"e24b6492-332b-421d-b2cd-ab1f11e06432","Type":"ContainerStarted","Data":"70f06f0ef1af24a7dbc0fbe98dcb6be18418b81e2b44242ad2aa8368a25f4134"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.702843 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hxtfq" event={"ID":"0a5bf285-90ef-47c5-a959-7bae63c410a5","Type":"ContainerStarted","Data":"15c51df52bf01172d5fcdcf50daf2cbb15103b18c13ac0217b4576703757b02a"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.703545 4748 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" event={"ID":"3a7350f6-60df-4463-a08c-1b231d9443c6","Type":"ContainerStarted","Data":"cf5e54029355cdfb1316e2336a55a4e8a743da6eb91115ec20518f6443a3ccfa"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.704225 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dwfwm" event={"ID":"afa0ff09-5605-496f-806a-a8719dd3d958","Type":"ContainerStarted","Data":"36263be7d8efe75e5d70f567d6c283564d98af46e639f72fc451ab774bf0759c"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.705564 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" event={"ID":"a49c3549-30d5-4927-ae41-cc2b9e7f47e2","Type":"ContainerStarted","Data":"3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.706122 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.707203 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" event={"ID":"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe","Type":"ContainerStarted","Data":"e9c44412dde98d0cd2ea744588340bbacee85ce247ccae1c540fcd9ef4893fd3"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.708900 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" event={"ID":"64222460-9199-40c1-bf14-07be02855e39","Type":"ContainerStarted","Data":"87f487f874459efa214c3fc937f5e42a173c301f6db7615c6b834879a4414cab"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.710817 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.711599 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.211580653 +0000 UTC m=+134.353126467 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.714330 4748 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-xwxlw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.714416 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" podUID="a49c3549-30d5-4927-ae41-cc2b9e7f47e2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.716695 4748 generic.go:334] "Generic (PLEG): container finished" 
podID="3ec3ee66-08f2-44d5-a414-90354f5a4a9e" containerID="ee4c81d5e98498d39b6c7b8702d8e5d358c7af0b4051970f802b9005eb9845a9" exitCode=0 Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.716841 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" event={"ID":"3ec3ee66-08f2-44d5-a414-90354f5a4a9e","Type":"ContainerDied","Data":"ee4c81d5e98498d39b6c7b8702d8e5d358c7af0b4051970f802b9005eb9845a9"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.722763 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" event={"ID":"2dd2dd89-ba90-440f-abc8-74ab27d7db69","Type":"ContainerStarted","Data":"f26e13d2a4bc838ed9c49216a3f99f4fc8a056e05e61c73cdfe6e29da627a0df"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.724097 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj" event={"ID":"65bf442d-adbb-4d6d-b0ee-55bbba31e305","Type":"ContainerStarted","Data":"5d16f57ad2dc081df69ec89ab7fb306a1758c7259352b29ecaeb9f9cd21515da"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.726049 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2" event={"ID":"96d563e0-088d-4fd3-a3cf-dc70e4a40d99","Type":"ContainerStarted","Data":"a0f2c861b896d35871c1ec0843b62eb00b3ed316e42c869ed9b4e9560bc45c23"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.728374 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" event={"ID":"b34a7162-6088-486a-a1fc-6e01d4bd7ca6","Type":"ContainerStarted","Data":"06a3cff25a7416f663061ec6f8be3f2ce785f64d69a755355ce2508e81c405a8"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.728404 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" event={"ID":"b34a7162-6088-486a-a1fc-6e01d4bd7ca6","Type":"ContainerStarted","Data":"2d11afe415cac263014c8775c1bbe26e1efc8ebebe0280bd020838410d1913d4"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.730867 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl" event={"ID":"fc943725-6ebd-4214-92c4-6f4d434ec186","Type":"ContainerStarted","Data":"90708eb1a52fd74f151765f599b950e753300729e428cc863934abfbb927d6ff"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.730892 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl" event={"ID":"fc943725-6ebd-4214-92c4-6f4d434ec186","Type":"ContainerStarted","Data":"5ddb3193d3c3ea71ec197ea140148451e91cb1c68407d7604358688a4cecb4f6"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.732422 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tqb9k" podStartSLOduration=74.732397622 podStartE2EDuration="1m14.732397622s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:18.730177024 +0000 UTC m=+133.871722838" watchObservedRunningTime="2026-03-20 10:38:18.732397622 +0000 UTC m=+133.873943436" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.733017 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" event={"ID":"2cc53e20-383b-4e3a-a00a-d54ac8272e00","Type":"ContainerStarted","Data":"a8cb0f23fa8fae826379e307c3e36c50f6d5ab73779f3d5d72cc8a9ea31173b4"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.734419 4748 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-infra/auto-csr-approver-29566718-7hmv9" event={"ID":"0de9aa72-edab-4ae9-b2dd-e20ef6b83277","Type":"ContainerStarted","Data":"e28cd3ed09ba0b5d63e50a5244ab15bacfd9d12d0375232c3a16a9ff4ce18ef7"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.743765 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" event={"ID":"eb3bf250-9fda-4426-baa3-48eed453f90d","Type":"ContainerStarted","Data":"4bed520f45f282afd4387c91491a57cc82f27fd4856d675181658babac47c859"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.750212 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" event={"ID":"0e36c673-8d30-4267-a115-ec3b62ad093a","Type":"ContainerStarted","Data":"cb27d16fa422b010d7c56d2a4b4e275eef007c4e4538594bfe399b5ca738cdca"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.751759 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm" event={"ID":"6a3e2353-3374-48ad-968b-8f4b487d63ed","Type":"ContainerStarted","Data":"ef81c80ff950d144514630208f9cd68ab6964b27c33e359f3b1750f6912365a6"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.752815 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dmmdh" event={"ID":"ae334fdf-f952-4b6b-8372-1fd7ef332362","Type":"ContainerStarted","Data":"8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.754027 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vn8zj" event={"ID":"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55","Type":"ContainerStarted","Data":"b8d663893a0f0cb277161ed1b90aeda62f667d7e1a7747a65447f09d3b14cc8d"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.756920 4748 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" event={"ID":"10c5886c-4f03-45fb-b71a-a9213027a4fd","Type":"ContainerStarted","Data":"32857229f70d989493cc3f55cb66158c77520c383c4ecbda2c521880f39e094c"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.757861 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.759497 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" event={"ID":"c0116c88-07ce-460e-90ba-2b5d8fd6a921","Type":"ContainerStarted","Data":"74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.759519 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" event={"ID":"c0116c88-07ce-460e-90ba-2b5d8fd6a921","Type":"ContainerStarted","Data":"81e4b77d8b16210c5e522889b70c1a16f09db91b82a58af028394ed897c21715"} Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.764617 4748 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zbshh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.764705 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" podUID="10c5886c-4f03-45fb-b71a-a9213027a4fd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.767090 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.768022 4748 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hsr94 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.768086 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" podUID="51c43cdc-1f52-4850-9c2d-d317dc38c754" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.769517 4748 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hg8kp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.769563 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" podUID="ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.776127 4748 patch_prober.go:28] interesting pod/console-operator-58897d9998-9x2kc container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 
10:38:18.776207 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9x2kc" podUID="122fe6aa-0852-4ddd-a678-6ce541c38250" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.776300 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6q2px" podStartSLOduration=74.77627243 podStartE2EDuration="1m14.77627243s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:18.764335005 +0000 UTC m=+133.905880819" watchObservedRunningTime="2026-03-20 10:38:18.77627243 +0000 UTC m=+133.917818244" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.811493 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" podStartSLOduration=73.811394716 podStartE2EDuration="1m13.811394716s" podCreationTimestamp="2026-03-20 10:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:18.809139707 +0000 UTC m=+133.950685521" watchObservedRunningTime="2026-03-20 10:38:18.811394716 +0000 UTC m=+133.952940530" Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.812338 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 
10:38:18.812616 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.312567927 +0000 UTC m=+134.454113741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.812874 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.816945 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.316926862 +0000 UTC m=+134.458472766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.914005 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.914153 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.414128627 +0000 UTC m=+134.555674441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:18 crc kubenswrapper[4748]: I0320 10:38:18.914251 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:18 crc kubenswrapper[4748]: E0320 10:38:18.914573 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.414560408 +0000 UTC m=+134.556106222 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.016458 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.016616 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.5165945 +0000 UTC m=+134.658140314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.016895 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.017243 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.517220417 +0000 UTC m=+134.658766231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.119225 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.119454 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.619421493 +0000 UTC m=+134.760967317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.119539 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.119978 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.619966608 +0000 UTC m=+134.761512442 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.130631 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zwht2" podStartSLOduration=75.130603398 podStartE2EDuration="1m15.130603398s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.129716425 +0000 UTC m=+134.271262259" watchObservedRunningTime="2026-03-20 10:38:19.130603398 +0000 UTC m=+134.272149212" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.203524 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-p52m7" podStartSLOduration=6.203502472 podStartE2EDuration="6.203502472s" podCreationTimestamp="2026-03-20 10:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.168427326 +0000 UTC m=+134.309973140" watchObservedRunningTime="2026-03-20 10:38:19.203502472 +0000 UTC m=+134.345048276" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.220358 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.220518 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.72048629 +0000 UTC m=+134.862032104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.220643 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.221025 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.721016884 +0000 UTC m=+134.862562698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.244762 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9qxnc" podStartSLOduration=75.24473852 podStartE2EDuration="1m15.24473852s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.204013685 +0000 UTC m=+134.345559499" watchObservedRunningTime="2026-03-20 10:38:19.24473852 +0000 UTC m=+134.386284334" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.284659 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tvhtn" podStartSLOduration=75.284639843 podStartE2EDuration="1m15.284639843s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.249814854 +0000 UTC m=+134.391360668" watchObservedRunningTime="2026-03-20 10:38:19.284639843 +0000 UTC m=+134.426185657" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.286142 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9lbsk" podStartSLOduration=75.286137252 podStartE2EDuration="1m15.286137252s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.284886109 +0000 UTC m=+134.426431923" watchObservedRunningTime="2026-03-20 10:38:19.286137252 +0000 UTC m=+134.427683066" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.324526 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.324683 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.824654718 +0000 UTC m=+134.966200532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.324945 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.325285 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.825270945 +0000 UTC m=+134.966816759 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.331543 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" podStartSLOduration=74.33152202 podStartE2EDuration="1m14.33152202s" podCreationTimestamp="2026-03-20 10:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.330580455 +0000 UTC m=+134.472126269" watchObservedRunningTime="2026-03-20 10:38:19.33152202 +0000 UTC m=+134.473067834" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.408655 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4svjh" podStartSLOduration=74.408629604 podStartE2EDuration="1m14.408629604s" podCreationTimestamp="2026-03-20 10:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.406047676 +0000 UTC m=+134.547593490" watchObservedRunningTime="2026-03-20 10:38:19.408629604 +0000 UTC m=+134.550175418" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.426629 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.426845 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.926807593 +0000 UTC m=+135.068353407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.427062 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.427446 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:19.92743133 +0000 UTC m=+135.068977134 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.434996 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dmmdh" podStartSLOduration=75.434963969 podStartE2EDuration="1m15.434963969s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.432957916 +0000 UTC m=+134.574503740" watchObservedRunningTime="2026-03-20 10:38:19.434963969 +0000 UTC m=+134.576509783" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.528400 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.528811 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.028554758 +0000 UTC m=+135.170100572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.528853 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.532379 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.032366089 +0000 UTC m=+135.173911903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.534697 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7zqwq" podStartSLOduration=75.53467662 podStartE2EDuration="1m15.53467662s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.532660386 +0000 UTC m=+134.674206200" watchObservedRunningTime="2026-03-20 10:38:19.53467662 +0000 UTC m=+134.676222434" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.566362 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:38:19 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld Mar 20 10:38:19 crc kubenswrapper[4748]: [+]process-running ok Mar 20 10:38:19 crc kubenswrapper[4748]: healthz check failed Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.566433 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.566784 4748 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8ttxd" podStartSLOduration=75.566764956 podStartE2EDuration="1m15.566764956s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.566348205 +0000 UTC m=+134.707894019" watchObservedRunningTime="2026-03-20 10:38:19.566764956 +0000 UTC m=+134.708310770" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.625629 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" podStartSLOduration=74.625607379 podStartE2EDuration="1m14.625607379s" podCreationTimestamp="2026-03-20 10:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.595668149 +0000 UTC m=+134.737213973" watchObservedRunningTime="2026-03-20 10:38:19.625607379 +0000 UTC m=+134.767153183" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.626607 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" podStartSLOduration=75.626600835 podStartE2EDuration="1m15.626600835s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.624593972 +0000 UTC m=+134.766139786" watchObservedRunningTime="2026-03-20 10:38:19.626600835 +0000 UTC m=+134.768146649" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.631323 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.631694 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.131677179 +0000 UTC m=+135.273222993 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.732922 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.733388 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.233364062 +0000 UTC m=+135.374909876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.767497 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" event={"ID":"0a71010c-36bf-4830-91be-1f54483db83f","Type":"ContainerStarted","Data":"6fdbffe1f8bb7daac06634f076577fce90cdeab5af3e5afb0fcdef4808cdd7a2"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.767805 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" event={"ID":"0a71010c-36bf-4830-91be-1f54483db83f","Type":"ContainerStarted","Data":"062aeeb39bab4c393e7ba0892ca3521553a91662bd89c83e1f4f3de65bdb80ed"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.768823 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.771883 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" event={"ID":"a32808c8-cba3-478e-bcf0-01434602a895","Type":"ContainerStarted","Data":"533888e489430867b9b434b99f3103857a27eb0f718b9f067c5d23767fbb4f8a"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.774632 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" 
event={"ID":"7e2e266d-1d26-4bbc-987e-e08b2fa0ccfe","Type":"ContainerStarted","Data":"2aa0625693bb55813bd8620673c60bcaa44a46024b75d355f3849e40a25ca699"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.778892 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" event={"ID":"0481ff64-8f10-4e72-a81e-d2c53278246a","Type":"ContainerStarted","Data":"bce435ab97a936421fc3e73e8cd607a771c8e6fb8aa365b28c7a2e77c4858bba"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.781434 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.783502 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp" event={"ID":"b59f4996-13e0-4b70-b835-fbeea765c681","Type":"ContainerStarted","Data":"3c2e1fb7e7b44b97535e50e27d6e467d2ba826d77a0c38fecd7ae67df50d9297"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.783551 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp" event={"ID":"b59f4996-13e0-4b70-b835-fbeea765c681","Type":"ContainerStarted","Data":"29d1e8186d3301fa716607203f1272df5555d3e54a15488a9c49a99cb038b925"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.792906 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" podStartSLOduration=74.792879412 podStartE2EDuration="1m14.792879412s" podCreationTimestamp="2026-03-20 10:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.786124984 +0000 UTC m=+134.927670798" watchObservedRunningTime="2026-03-20 10:38:19.792879412 +0000 UTC m=+134.934425226" Mar 20 10:38:19 crc 
kubenswrapper[4748]: I0320 10:38:19.801381 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj" event={"ID":"65bf442d-adbb-4d6d-b0ee-55bbba31e305","Type":"ContainerStarted","Data":"85f6f223a8a58d76c9a4613e3bb9c1ad121528847f13aa86bc41506bbd0466aa"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.805106 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5" event={"ID":"ef9e528c-68f5-4e04-9ceb-99e84335173c","Type":"ContainerStarted","Data":"f162900735df89516afdf691284a512736cff90bee92d86407ba802fc9c72ffa"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.807907 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dwfwm" event={"ID":"afa0ff09-5605-496f-806a-a8719dd3d958","Type":"ContainerStarted","Data":"5dc4a1b8b5963edf3f5ac6b463d269491cf315a001c3a3038c7e293cd4d25ffb"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.812606 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6wkh" podStartSLOduration=75.812580002 podStartE2EDuration="1m15.812580002s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.809450659 +0000 UTC m=+134.950996473" watchObservedRunningTime="2026-03-20 10:38:19.812580002 +0000 UTC m=+134.954125816" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.821300 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vn8zj" event={"ID":"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55","Type":"ContainerStarted","Data":"c38ba579acac3ba7590d64bb9322fc05c4be86024d8b0164623efa133f6ac389"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.834218 4748 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.835638 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.335609189 +0000 UTC m=+135.477155033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.839856 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rcw2w" podStartSLOduration=74.839819271 podStartE2EDuration="1m14.839819271s" podCreationTimestamp="2026-03-20 10:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.832666382 +0000 UTC m=+134.974212216" watchObservedRunningTime="2026-03-20 10:38:19.839819271 +0000 UTC m=+134.981365085" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.863491 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-brfsp" 
podStartSLOduration=75.863464394 podStartE2EDuration="1m15.863464394s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.852968517 +0000 UTC m=+134.994514341" watchObservedRunningTime="2026-03-20 10:38:19.863464394 +0000 UTC m=+135.005010218" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.895337 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" event={"ID":"3ec3ee66-08f2-44d5-a414-90354f5a4a9e","Type":"ContainerStarted","Data":"9570ba57b7744a9538ce0ae2f908e78e4082b6155e98f61bff3239734ba30448"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.895395 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.909550 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m7vd5" podStartSLOduration=75.90952823 podStartE2EDuration="1m15.90952823s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.906565822 +0000 UTC m=+135.048111636" watchObservedRunningTime="2026-03-20 10:38:19.90952823 +0000 UTC m=+135.051074044" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.934774 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm" event={"ID":"6a3e2353-3374-48ad-968b-8f4b487d63ed","Type":"ContainerStarted","Data":"dbfdafa90003def11a05aae9fc3f4c2c253caa9543c622bc40fa90ce40360be9"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.935704 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.939190 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dwfwm" podStartSLOduration=6.939167722 podStartE2EDuration="6.939167722s" podCreationTimestamp="2026-03-20 10:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.937680692 +0000 UTC m=+135.079226506" watchObservedRunningTime="2026-03-20 10:38:19.939167722 +0000 UTC m=+135.080713536" Mar 20 10:38:19 crc kubenswrapper[4748]: E0320 10:38:19.940175 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.440150728 +0000 UTC m=+135.581696612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.951173 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" event={"ID":"ff567791-a234-47f5-8350-154f93477bb9","Type":"ContainerStarted","Data":"2bd8201333e0e9a76b0ce3b628b0def30b0bbca25cea201f91a1ef5f87137a68"} Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.952351 4748 patch_prober.go:28] interesting pod/console-operator-58897d9998-9x2kc container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.952488 4748 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hg8kp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.952537 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" podUID="ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.952702 4748 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9x2kc" podUID="122fe6aa-0852-4ddd-a678-6ce541c38250" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.952708 4748 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hsr94 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.952768 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" podUID="51c43cdc-1f52-4850-9c2d-d317dc38c754" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.953034 4748 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zbshh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.953060 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" podUID="10c5886c-4f03-45fb-b71a-a9213027a4fd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.955070 4748 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-xwxlw container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.955099 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" podUID="a49c3549-30d5-4927-ae41-cc2b9e7f47e2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Mar 20 10:38:19 crc kubenswrapper[4748]: I0320 10:38:19.970813 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-phlbm" podStartSLOduration=75.970785756 podStartE2EDuration="1m15.970785756s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.965012784 +0000 UTC m=+135.106558618" watchObservedRunningTime="2026-03-20 10:38:19.970785756 +0000 UTC m=+135.112331570" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.027561 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq" podStartSLOduration=76.027539763 podStartE2EDuration="1m16.027539763s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:19.999474613 +0000 UTC m=+135.141020437" watchObservedRunningTime="2026-03-20 10:38:20.027539763 +0000 UTC m=+135.169085567" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.033902 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" podStartSLOduration=76.03387398 podStartE2EDuration="1m16.03387398s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:20.02703881 +0000 UTC m=+135.168584624" watchObservedRunningTime="2026-03-20 10:38:20.03387398 +0000 UTC m=+135.175419794" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.036811 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.037242 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.537213399 +0000 UTC m=+135.678759213 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.039732 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.057771 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.55774337 +0000 UTC m=+135.699289184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.067971 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" podStartSLOduration=7.067939939 podStartE2EDuration="7.067939939s" podCreationTimestamp="2026-03-20 10:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:20.056401805 +0000 UTC m=+135.197947619" watchObservedRunningTime="2026-03-20 10:38:20.067939939 +0000 UTC m=+135.209485773" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.091619 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" podStartSLOduration=76.091596353 podStartE2EDuration="1m16.091596353s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:20.089807186 +0000 UTC m=+135.231353000" watchObservedRunningTime="2026-03-20 10:38:20.091596353 +0000 UTC m=+135.233142167" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.144559 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.144824 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.644782827 +0000 UTC m=+135.786328641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.145079 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.145498 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.645487265 +0000 UTC m=+135.787033079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.169406 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-fftbt" podStartSLOduration=76.169381846 podStartE2EDuration="1m16.169381846s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:20.137577297 +0000 UTC m=+135.279123111" watchObservedRunningTime="2026-03-20 10:38:20.169381846 +0000 UTC m=+135.310927660" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.170001 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-hxtfq" podStartSLOduration=76.169994362 podStartE2EDuration="1m16.169994362s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:20.164384474 +0000 UTC m=+135.305930308" watchObservedRunningTime="2026-03-20 10:38:20.169994362 +0000 UTC m=+135.311540206" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.246334 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.246522 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.74649169 +0000 UTC m=+135.888037504 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.246874 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.247265 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.7472569 +0000 UTC m=+135.888802714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.262356 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-kc5xl" podStartSLOduration=76.262336688 podStartE2EDuration="1m16.262336688s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:20.211872527 +0000 UTC m=+135.353418351" watchObservedRunningTime="2026-03-20 10:38:20.262336688 +0000 UTC m=+135.403882502" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.263408 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5" podStartSLOduration=75.263402116 podStartE2EDuration="1m15.263402116s" podCreationTimestamp="2026-03-20 10:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:20.261731012 +0000 UTC m=+135.403276826" watchObservedRunningTime="2026-03-20 10:38:20.263402116 +0000 UTC m=+135.404947930" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.295725 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" podStartSLOduration=75.295705369 podStartE2EDuration="1m15.295705369s" podCreationTimestamp="2026-03-20 10:37:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:20.29310434 +0000 UTC m=+135.434650154" watchObservedRunningTime="2026-03-20 10:38:20.295705369 +0000 UTC m=+135.437251183" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.347753 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.348454 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.84843428 +0000 UTC m=+135.989980094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.450228 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:20.950211245 +0000 UTC m=+136.091757059 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.449827 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.487204 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.488107 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.489977 4748 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-2xltn container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.490030 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" podUID="e24b6492-332b-421d-b2cd-ab1f11e06432" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: 
connection refused" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.551747 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.552174 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:21.052156785 +0000 UTC m=+136.193702599 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.560347 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:38:20 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld Mar 20 10:38:20 crc kubenswrapper[4748]: [+]process-running ok Mar 20 10:38:20 crc kubenswrapper[4748]: healthz check failed Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.560699 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.653286 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.653712 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:21.153694424 +0000 UTC m=+136.295240238 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.753935 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.754262 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:21.254240167 +0000 UTC m=+136.395785981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.856624 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.857092 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:21.357075809 +0000 UTC m=+136.498621623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.957547 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:20 crc kubenswrapper[4748]: E0320 10:38:20.957847 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:21.457815217 +0000 UTC m=+136.599361031 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:20 crc kubenswrapper[4748]: I0320 10:38:20.989522 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" event={"ID":"0481ff64-8f10-4e72-a81e-d2c53278246a","Type":"ContainerStarted","Data":"594b6d0915378726eb56a2b89eac8293d3fdd65aa29e40f948b821cb88749981"} Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.001880 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" event={"ID":"ff567791-a234-47f5-8350-154f93477bb9","Type":"ContainerStarted","Data":"93f1a7279c27a1cbf49009c86b7f4f37dd6d21613154296da70b62eb2d797427"} Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.007699 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj" event={"ID":"65bf442d-adbb-4d6d-b0ee-55bbba31e305","Type":"ContainerStarted","Data":"e5501cb8c3d948b25de91391c613498def1dea527b0e0607e3dfef9b2be6d591"} Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.014987 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" event={"ID":"3a7350f6-60df-4463-a08c-1b231d9443c6","Type":"ContainerStarted","Data":"52753d819dfb978baee5433469fe57467a6b141f3ab03ef91ab3567871400c30"} Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.019378 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vn8zj" 
event={"ID":"f2320984-2f2c-4a56-a1d1-0c9b2e5f2d55","Type":"ContainerStarted","Data":"fc502871f71eb7b69e3c91afa58b4aa96915be12db3b3343e24b5104b9ae00b4"} Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.019430 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-vn8zj" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.034145 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sksdb" podStartSLOduration=77.03412651 podStartE2EDuration="1m17.03412651s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:21.031316016 +0000 UTC m=+136.172861830" watchObservedRunningTime="2026-03-20 10:38:21.03412651 +0000 UTC m=+136.175672324" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.079093 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:21 crc kubenswrapper[4748]: E0320 10:38:21.081715 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:21.581701546 +0000 UTC m=+136.723247350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.099707 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" podStartSLOduration=77.09968607 podStartE2EDuration="1m17.09968607s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:21.067811549 +0000 UTC m=+136.209357373" watchObservedRunningTime="2026-03-20 10:38:21.09968607 +0000 UTC m=+136.241231884" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.101777 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7cnpj" podStartSLOduration=76.101767375 podStartE2EDuration="1m16.101767375s" podCreationTimestamp="2026-03-20 10:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:21.099233728 +0000 UTC m=+136.240779542" watchObservedRunningTime="2026-03-20 10:38:21.101767375 +0000 UTC m=+136.243313189" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.158250 4748 ???:1] "http: TLS handshake error from 192.168.126.11:52806: no serving certificate available for the kubelet" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.180239 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:21 crc kubenswrapper[4748]: E0320 10:38:21.180564 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:21.680546264 +0000 UTC m=+136.822092068 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.281699 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:21 crc kubenswrapper[4748]: E0320 10:38:21.282093 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:21.782080632 +0000 UTC m=+136.923626446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.290293 4748 ???:1] "http: TLS handshake error from 192.168.126.11:52822: no serving certificate available for the kubelet" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.383048 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:21 crc kubenswrapper[4748]: E0320 10:38:21.383410 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:21.883391075 +0000 UTC m=+137.024936889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.437089 4748 ???:1] "http: TLS handshake error from 192.168.126.11:52824: no serving certificate available for the kubelet" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.485043 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:21 crc kubenswrapper[4748]: E0320 10:38:21.485361 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:21.985346385 +0000 UTC m=+137.126892199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.558078 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:38:21 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld Mar 20 10:38:21 crc kubenswrapper[4748]: [+]process-running ok Mar 20 10:38:21 crc kubenswrapper[4748]: healthz check failed Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.558142 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.586236 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:21 crc kubenswrapper[4748]: E0320 10:38:21.586573 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:38:22.086556986 +0000 UTC m=+137.228102800 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.629493 4748 ???:1] "http: TLS handshake error from 192.168.126.11:52828: no serving certificate available for the kubelet" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.635356 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vn8zj" podStartSLOduration=8.635336533 podStartE2EDuration="8.635336533s" podCreationTimestamp="2026-03-20 10:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:21.149208717 +0000 UTC m=+136.290754531" watchObservedRunningTime="2026-03-20 10:38:21.635336533 +0000 UTC m=+136.776882337" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.656603 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.663187 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.692660 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:21 crc kubenswrapper[4748]: E0320 10:38:21.693300 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.193280422 +0000 UTC m=+137.334826316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.706658 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.706642474 podStartE2EDuration="706.642474ms" podCreationTimestamp="2026-03-20 10:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:21.705407191 +0000 UTC m=+136.846953005" watchObservedRunningTime="2026-03-20 10:38:21.706642474 +0000 UTC m=+136.848188288" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.794910 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:21 crc kubenswrapper[4748]: E0320 10:38:21.795199 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.2951805 +0000 UTC m=+137.436726314 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.807850 4748 ???:1] "http: TLS handshake error from 192.168.126.11:52842: no serving certificate available for the kubelet" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.896327 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:21 crc kubenswrapper[4748]: E0320 10:38:21.896663 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.396650467 +0000 UTC m=+137.538196281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.915315 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-llsmr"] Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.916250 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.945045 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.963653 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llsmr"] Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.969698 4748 ???:1] "http: TLS handshake error from 192.168.126.11:52858: no serving certificate available for the kubelet" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.988942 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zbshh" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.997161 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:21 crc 
kubenswrapper[4748]: E0320 10:38:21.997403 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.497360014 +0000 UTC m=+137.638905828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.997623 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-catalog-content\") pod \"community-operators-llsmr\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.997716 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.997802 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-utilities\") pod \"community-operators-llsmr\" (UID: 
\"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:21 crc kubenswrapper[4748]: I0320 10:38:21.997897 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcb89\" (UniqueName: \"kubernetes.io/projected/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-kube-api-access-bcb89\") pod \"community-operators-llsmr\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:21 crc kubenswrapper[4748]: E0320 10:38:21.998032 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.498022622 +0000 UTC m=+137.639568436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.098939 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.099175 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.59913543 +0000 UTC m=+137.740681244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.099633 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-catalog-content\") pod \"community-operators-llsmr\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.099754 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.099848 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-utilities\") pod \"community-operators-llsmr\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.099927 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-bcb89\" (UniqueName: \"kubernetes.io/projected/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-kube-api-access-bcb89\") pod \"community-operators-llsmr\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.101150 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.601138193 +0000 UTC m=+137.742684007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.102194 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-utilities\") pod \"community-operators-llsmr\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.103089 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-catalog-content\") pod \"community-operators-llsmr\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.105987 4748 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-b6j7x"] Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.106871 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.112890 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.129772 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6j7x"] Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.171042 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcb89\" (UniqueName: \"kubernetes.io/projected/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-kube-api-access-bcb89\") pod \"community-operators-llsmr\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.203988 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.204269 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsdsv\" (UniqueName: \"kubernetes.io/projected/61390690-1bd1-43c9-b82b-e2c5fe3450f9-kube-api-access-bsdsv\") pod \"certified-operators-b6j7x\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.204293 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-catalog-content\") pod \"certified-operators-b6j7x\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.204341 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-utilities\") pod \"certified-operators-b6j7x\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.204489 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.704473389 +0000 UTC m=+137.846019203 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.216227 4748 ???:1] "http: TLS handshake error from 192.168.126.11:52864: no serving certificate available for the kubelet" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.236256 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.307362 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-utilities\") pod \"certified-operators-b6j7x\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.307426 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.307451 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsdsv\" (UniqueName: \"kubernetes.io/projected/61390690-1bd1-43c9-b82b-e2c5fe3450f9-kube-api-access-bsdsv\") pod \"certified-operators-b6j7x\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.307469 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-catalog-content\") pod \"certified-operators-b6j7x\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.307866 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-catalog-content\") pod 
\"certified-operators-b6j7x\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.308149 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-utilities\") pod \"certified-operators-b6j7x\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.308393 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.8083765 +0000 UTC m=+137.949922314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.308561 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mhpdf"] Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.309430 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.334493 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mhpdf"] Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.357252 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsdsv\" (UniqueName: \"kubernetes.io/projected/61390690-1bd1-43c9-b82b-e2c5fe3450f9-kube-api-access-bsdsv\") pod \"certified-operators-b6j7x\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.410472 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.410642 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.910610888 +0000 UTC m=+138.052156692 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.410698 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.410964 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-utilities\") pod \"community-operators-mhpdf\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.411015 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvb4\" (UniqueName: \"kubernetes.io/projected/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-kube-api-access-sjvb4\") pod \"community-operators-mhpdf\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.411035 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-catalog-content\") pod \"community-operators-mhpdf\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.411349 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:22.911340347 +0000 UTC m=+138.052886161 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.437087 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.515538 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.532140 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:38:23.032098663 +0000 UTC m=+138.173644477 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.536713 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-utilities\") pod \"community-operators-mhpdf\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.536790 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvb4\" (UniqueName: \"kubernetes.io/projected/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-kube-api-access-sjvb4\") pod \"community-operators-mhpdf\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.536812 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-catalog-content\") pod \"community-operators-mhpdf\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.536868 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.537339 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:23.037319891 +0000 UTC m=+138.178865705 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.537739 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-utilities\") pod \"community-operators-mhpdf\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.538244 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-catalog-content\") pod \"community-operators-mhpdf\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.540509 4748 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-5fgxw"] Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.541562 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.560076 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:38:22 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld Mar 20 10:38:22 crc kubenswrapper[4748]: [+]process-running ok Mar 20 10:38:22 crc kubenswrapper[4748]: healthz check failed Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.560123 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.561686 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5fgxw"] Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.606854 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjvb4\" (UniqueName: \"kubernetes.io/projected/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-kube-api-access-sjvb4\") pod \"community-operators-mhpdf\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.624281 4748 ???:1] "http: TLS handshake error from 192.168.126.11:42620: no serving certificate available for the kubelet" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.639668 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.640849 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.640993 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-catalog-content\") pod \"certified-operators-5fgxw\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.641048 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.641075 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgnhj\" (UniqueName: \"kubernetes.io/projected/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-kube-api-access-mgnhj\") pod \"certified-operators-5fgxw\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.641101 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-utilities\") pod 
\"certified-operators-5fgxw\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.641196 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:23.141180681 +0000 UTC m=+138.282726495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.656515 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7a2dfc5-1dd9-4ef1-9419-39f60da74b16-metrics-certs\") pod \"network-metrics-daemon-5jzd5\" (UID: \"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16\") " pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.743566 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-utilities\") pod \"certified-operators-5fgxw\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.743646 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-catalog-content\") pod \"certified-operators-5fgxw\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.743686 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.743737 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgnhj\" (UniqueName: \"kubernetes.io/projected/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-kube-api-access-mgnhj\") pod \"certified-operators-5fgxw\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.744288 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-utilities\") pod \"certified-operators-5fgxw\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.744566 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-catalog-content\") pod \"certified-operators-5fgxw\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.744869 4748 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:23.244855117 +0000 UTC m=+138.386400931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.790799 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgnhj\" (UniqueName: \"kubernetes.io/projected/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-kube-api-access-mgnhj\") pod \"certified-operators-5fgxw\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.791104 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jzd5" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.844558 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.844953 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:38:23.344927837 +0000 UTC m=+138.486473651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.946498 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:22 crc kubenswrapper[4748]: E0320 10:38:22.946817 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:23.446805225 +0000 UTC m=+138.588351039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.976684 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.981114 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llsmr"] Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.994781 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hg8kp"] Mar 20 10:38:22 crc kubenswrapper[4748]: I0320 10:38:22.995332 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" podUID="ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" containerName="controller-manager" containerID="cri-o://e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0" gracePeriod=30 Mar 20 10:38:23 crc kubenswrapper[4748]: W0320 10:38:23.012078 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5196dc0c_2a46_4fb2_891b_682a6ce5eed9.slice/crio-c5597c5769d09d8f641ee101ee959907fac4c4a7fcf56e80203ed0deb77ad325 WatchSource:0}: Error finding container c5597c5769d09d8f641ee101ee959907fac4c4a7fcf56e80203ed0deb77ad325: Status 404 returned error can't find the container with id c5597c5769d09d8f641ee101ee959907fac4c4a7fcf56e80203ed0deb77ad325 Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.012131 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"] Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.019690 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.051865 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:23 crc kubenswrapper[4748]: E0320 10:38:23.052176 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:23.552149564 +0000 UTC m=+138.693695378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.118322 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" podUID="a49c3549-30d5-4927-ae41-cc2b9e7f47e2" containerName="route-controller-manager" containerID="cri-o://3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2" gracePeriod=30 Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.118461 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" event={"ID":"3a7350f6-60df-4463-a08c-1b231d9443c6","Type":"ContainerStarted","Data":"58e35404fb669167b783f6222314c77e6d5850d70c2a153cb6eb3cce33b388ba"} Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.118486 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" 
event={"ID":"3a7350f6-60df-4463-a08c-1b231d9443c6","Type":"ContainerStarted","Data":"c436c68bdd64b37bbdbd16767d53c8e54e0ce24fbc8e75e365e34842f70b0a6d"} Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.153602 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:23 crc kubenswrapper[4748]: E0320 10:38:23.153994 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:23.653980171 +0000 UTC m=+138.795525985 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.256424 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:23 crc kubenswrapper[4748]: E0320 10:38:23.257003 4748 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:23.756986229 +0000 UTC m=+138.898532033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.356727 4748 ???:1] "http: TLS handshake error from 192.168.126.11:42624: no serving certificate available for the kubelet" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.358424 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:23 crc kubenswrapper[4748]: E0320 10:38:23.358707 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:23.858694482 +0000 UTC m=+139.000240286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.365151 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6j7x"] Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.381875 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.382465 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.393874 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.397650 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.397895 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.452054 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mhpdf"] Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.460304 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.460550 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bb9bf98a-c134-47ae-9ab8-d287fc56d70c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.460605 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bb9bf98a-c134-47ae-9ab8-d287fc56d70c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:38:23 crc kubenswrapper[4748]: E0320 10:38:23.460753 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:23.960737134 +0000 UTC m=+139.102282948 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.583887 4748 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.584299 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bb9bf98a-c134-47ae-9ab8-d287fc56d70c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.584420 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.584476 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bb9bf98a-c134-47ae-9ab8-d287fc56d70c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.584620 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bb9bf98a-c134-47ae-9ab8-d287fc56d70c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:38:23 crc kubenswrapper[4748]: E0320 10:38:23.584890 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:24.08487522 +0000 UTC m=+139.226421034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.607140 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5jzd5"] Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.609244 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:38:23 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld Mar 20 10:38:23 crc kubenswrapper[4748]: [+]process-running ok Mar 20 10:38:23 crc kubenswrapper[4748]: healthz check failed Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.609309 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" 
podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.613921 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bb9bf98a-c134-47ae-9ab8-d287fc56d70c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.687419 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:23 crc kubenswrapper[4748]: E0320 10:38:23.687700 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:24.187682062 +0000 UTC m=+139.329227876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.739438 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.788550 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5fgxw"] Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.789297 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:23 crc kubenswrapper[4748]: E0320 10:38:23.789635 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:24.289618602 +0000 UTC m=+139.431164416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.794148 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.883963 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.889662 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-client-ca\") pod \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.889706 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-config\") pod \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.889792 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.889819 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-serving-cert\") pod \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.889891 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-proxy-ca-bundles\") pod \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.889954 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hp5qr\" (UniqueName: \"kubernetes.io/projected/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-kube-api-access-hp5qr\") pod \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\" (UID: \"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.890868 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" (UID: "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.891179 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-client-ca" (OuterVolumeSpecName: "client-ca") pod "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" (UID: "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.891391 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-config" (OuterVolumeSpecName: "config") pod "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" (UID: "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:38:23 crc kubenswrapper[4748]: E0320 10:38:23.891480 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:38:24.391461219 +0000 UTC m=+139.533007033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.898001 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" (UID: "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.901440 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-kube-api-access-hp5qr" (OuterVolumeSpecName: "kube-api-access-hp5qr") pod "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" (UID: "ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa"). InnerVolumeSpecName "kube-api-access-hp5qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.992245 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfjwf\" (UniqueName: \"kubernetes.io/projected/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-kube-api-access-gfjwf\") pod \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.992309 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-serving-cert\") pod \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.992340 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-config\") pod \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.992395 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-client-ca\") pod \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\" (UID: \"a49c3549-30d5-4927-ae41-cc2b9e7f47e2\") " Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.992561 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.992638 4748 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.992654 4748 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.992665 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp5qr\" (UniqueName: \"kubernetes.io/projected/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-kube-api-access-hp5qr\") on node \"crc\" DevicePath \"\"" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.992674 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.992683 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:38:23 crc kubenswrapper[4748]: E0320 10:38:23.992955 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:38:24.492942717 +0000 UTC m=+139.634488531 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tlkp2" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.993447 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-client-ca" (OuterVolumeSpecName: "client-ca") pod "a49c3549-30d5-4927-ae41-cc2b9e7f47e2" (UID: "a49c3549-30d5-4927-ae41-cc2b9e7f47e2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.993610 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-config" (OuterVolumeSpecName: "config") pod "a49c3549-30d5-4927-ae41-cc2b9e7f47e2" (UID: "a49c3549-30d5-4927-ae41-cc2b9e7f47e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.996100 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a49c3549-30d5-4927-ae41-cc2b9e7f47e2" (UID: "a49c3549-30d5-4927-ae41-cc2b9e7f47e2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:38:23 crc kubenswrapper[4748]: I0320 10:38:23.996205 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-kube-api-access-gfjwf" (OuterVolumeSpecName: "kube-api-access-gfjwf") pod "a49c3549-30d5-4927-ae41-cc2b9e7f47e2" (UID: "a49c3549-30d5-4927-ae41-cc2b9e7f47e2"). InnerVolumeSpecName "kube-api-access-gfjwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.052110 4748 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T10:38:23.58413743Z","Handler":null,"Name":""} Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.056377 4748 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.056415 4748 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.093576 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.093818 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:38:24 crc kubenswrapper[4748]: 
I0320 10:38:24.093857 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfjwf\" (UniqueName: \"kubernetes.io/projected/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-kube-api-access-gfjwf\") on node \"crc\" DevicePath \"\""
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.093867 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.093878 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a49c3549-30d5-4927-ae41-cc2b9e7f47e2-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.098057 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.125860 4748 generic.go:334] "Generic (PLEG): container finished" podID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerID="90e88093111b4b53979d7ae3e3ea5b78ac9039492237a398606f6c5608ebbbec" exitCode=0
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.125916 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fgxw" event={"ID":"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b","Type":"ContainerDied","Data":"90e88093111b4b53979d7ae3e3ea5b78ac9039492237a398606f6c5608ebbbec"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.125943 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fgxw" event={"ID":"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b","Type":"ContainerStarted","Data":"2b5d251b09957d35bc3c64e4fae137287bbfa416183552edd9f4f2ad7e06beaa"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.131311 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" event={"ID":"3a7350f6-60df-4463-a08c-1b231d9443c6","Type":"ContainerStarted","Data":"6a06eca0baf0c40b24cbc1814d1db9b59e0727cfc877f1912149a4c714c6b8c7"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.132644 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" event={"ID":"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16","Type":"ContainerStarted","Data":"4a7c6ab2f1f2f349380db9e7d94024a9564c6d429c07f89d73a24d015457f8fd"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.133781 4748 generic.go:334] "Generic (PLEG): container finished" podID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" containerID="c293e7296d050a2b3c7c25422c7d08c1b492af3270631d8b6de2f0928974382b" exitCode=0
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.133819 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6j7x" event={"ID":"61390690-1bd1-43c9-b82b-e2c5fe3450f9","Type":"ContainerDied","Data":"c293e7296d050a2b3c7c25422c7d08c1b492af3270631d8b6de2f0928974382b"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.133846 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6j7x" event={"ID":"61390690-1bd1-43c9-b82b-e2c5fe3450f9","Type":"ContainerStarted","Data":"d56cd3eab7c073eb4a9b927e8f7f1421bab87030d220d6f6c3db782dfe765b2b"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.135874 4748 generic.go:334] "Generic (PLEG): container finished" podID="ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" containerID="e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0" exitCode=0
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.135907 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" event={"ID":"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa","Type":"ContainerDied","Data":"e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.135922 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp" event={"ID":"ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa","Type":"ContainerDied","Data":"d72e87d8b8c59f8bfec7e04485646c905ac0293152e1c4035189637f966e6004"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.135938 4748 scope.go:117] "RemoveContainer" containerID="e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.136006 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hg8kp"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.144159 4748 generic.go:334] "Generic (PLEG): container finished" podID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerID="31def6896a1e41acae6771bff4c30bfe314ab0085f30b94adecad53864befc02" exitCode=0
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.144242 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhpdf" event={"ID":"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7","Type":"ContainerDied","Data":"31def6896a1e41acae6771bff4c30bfe314ab0085f30b94adecad53864befc02"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.144271 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhpdf" event={"ID":"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7","Type":"ContainerStarted","Data":"249624e5b4dfee6dd0ee675c9b5ddcc0e94f027e96561df06404e467bf4cd4c9"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.146302 4748 generic.go:334] "Generic (PLEG): container finished" podID="a49c3549-30d5-4927-ae41-cc2b9e7f47e2" containerID="3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2" exitCode=0
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.146463 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.146597 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" event={"ID":"a49c3549-30d5-4927-ae41-cc2b9e7f47e2","Type":"ContainerDied","Data":"3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.146683 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw" event={"ID":"a49c3549-30d5-4927-ae41-cc2b9e7f47e2","Type":"ContainerDied","Data":"b291e9456845147eb9330180fa0fe46d069360b20e2f5ae1b5e6bdbb4864313a"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.154967 4748 generic.go:334] "Generic (PLEG): container finished" podID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerID="a9df4eadc7d988167690805eecffd70491944b494f875516f22dfcef0c5b96ac" exitCode=0
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.155016 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsmr" event={"ID":"5196dc0c-2a46-4fb2-891b-682a6ce5eed9","Type":"ContainerDied","Data":"a9df4eadc7d988167690805eecffd70491944b494f875516f22dfcef0c5b96ac"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.155047 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsmr" event={"ID":"5196dc0c-2a46-4fb2-891b-682a6ce5eed9","Type":"ContainerStarted","Data":"c5597c5769d09d8f641ee101ee959907fac4c4a7fcf56e80203ed0deb77ad325"}
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.161443 4748 scope.go:117] "RemoveContainer" containerID="e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0"
Mar 20 10:38:24 crc kubenswrapper[4748]: E0320 10:38:24.162199 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0\": container with ID starting with e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0 not found: ID does not exist" containerID="e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.162258 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0"} err="failed to get container status \"e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0\": rpc error: code = NotFound desc = could not find container \"e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0\": container with ID starting with e74d946db6099fda9b80174c7678cf74393b28765301edb3f89a3be9b76f75e0 not found: ID does not exist"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.162289 4748 scope.go:117] "RemoveContainer" containerID="3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.183800 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cfzr7" podStartSLOduration=11.183779572 podStartE2EDuration="11.183779572s" podCreationTimestamp="2026-03-20 10:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:24.182894918 +0000 UTC m=+139.324440732" watchObservedRunningTime="2026-03-20 10:38:24.183779572 +0000 UTC m=+139.325325386"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.195363 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.204172 4748 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.204217 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.255650 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tlkp2\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") " pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.265544 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hg8kp"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.275238 4748 scope.go:117] "RemoveContainer" containerID="3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2"
Mar 20 10:38:24 crc kubenswrapper[4748]: E0320 10:38:24.277162 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2\": container with ID starting with 3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2 not found: ID does not exist" containerID="3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.277217 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2"} err="failed to get container status \"3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2\": rpc error: code = NotFound desc = could not find container \"3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2\": container with ID starting with 3ef3b3bb6b3ec40cdd4a0fde48ac8210369a91a6f8e2b6ead01596a0c4f4f3c2 not found: ID does not exist"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.277332 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hg8kp"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.282615 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.291728 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xwxlw"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.295242 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.312188 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8kqb5"]
Mar 20 10:38:24 crc kubenswrapper[4748]: E0320 10:38:24.312510 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49c3549-30d5-4927-ae41-cc2b9e7f47e2" containerName="route-controller-manager"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.312536 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49c3549-30d5-4927-ae41-cc2b9e7f47e2" containerName="route-controller-manager"
Mar 20 10:38:24 crc kubenswrapper[4748]: E0320 10:38:24.312567 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" containerName="controller-manager"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.312576 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" containerName="controller-manager"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.312698 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49c3549-30d5-4927-ae41-cc2b9e7f47e2" containerName="route-controller-manager"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.312724 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" containerName="controller-manager"
Mar 20 10:38:24 crc kubenswrapper[4748]: W0320 10:38:24.313427 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbb9bf98a_c134_47ae_9ab8_d287fc56d70c.slice/crio-af768150a772b34720cd48b64d855c5c61f6a59b029af0be0ccd3c6241fcc8fb WatchSource:0}: Error finding container af768150a772b34720cd48b64d855c5c61f6a59b029af0be0ccd3c6241fcc8fb: Status 404 returned error can't find the container with id af768150a772b34720cd48b64d855c5c61f6a59b029af0be0ccd3c6241fcc8fb
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.313768 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.313819 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kqb5"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.318266 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.399461 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-catalog-content\") pod \"redhat-marketplace-8kqb5\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.399861 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnzq2\" (UniqueName: \"kubernetes.io/projected/0186ffa9-907a-4afd-953d-28665f7343da-kube-api-access-bnzq2\") pod \"redhat-marketplace-8kqb5\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.399958 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-utilities\") pod \"redhat-marketplace-8kqb5\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.501453 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-utilities\") pod \"redhat-marketplace-8kqb5\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.501513 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-catalog-content\") pod \"redhat-marketplace-8kqb5\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.501559 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnzq2\" (UniqueName: \"kubernetes.io/projected/0186ffa9-907a-4afd-953d-28665f7343da-kube-api-access-bnzq2\") pod \"redhat-marketplace-8kqb5\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.502377 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-utilities\") pod \"redhat-marketplace-8kqb5\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.503005 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-catalog-content\") pod \"redhat-marketplace-8kqb5\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.524157 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnzq2\" (UniqueName: \"kubernetes.io/projected/0186ffa9-907a-4afd-953d-28665f7343da-kube-api-access-bnzq2\") pod \"redhat-marketplace-8kqb5\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.546344 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.558969 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:24 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:24 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:24 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.559024 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.570763 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66957d87b7-q85bk"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.571567 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.573482 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.574878 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.578621 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.580209 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.580542 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.580967 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.581793 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.582566 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.583529 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.584344 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.586089 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.586484 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.586616 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.586680 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.586914 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.607723 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.610046 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66957d87b7-q85bk"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.633262 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kqb5"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.668417 4748 ???:1] "http: TLS handshake error from 192.168.126.11:42636: no serving certificate available for the kubelet"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.700906 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bsq7p"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.703399 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsq7p"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.703754 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4k4dq"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.703859 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-config\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.703904 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-serving-cert\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.703965 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-config\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.703987 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crg2s\" (UniqueName: \"kubernetes.io/projected/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-kube-api-access-crg2s\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.704063 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-proxy-ca-bundles\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.704109 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-serving-cert\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.704135 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rpww\" (UniqueName: \"kubernetes.io/projected/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-kube-api-access-5rpww\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.704163 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-client-ca\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.704200 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-client-ca\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.707954 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsq7p"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.806139 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-proxy-ca-bundles\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.806195 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-serving-cert\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.806216 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rpww\" (UniqueName: \"kubernetes.io/projected/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-kube-api-access-5rpww\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.806238 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-client-ca\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.806262 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-catalog-content\") pod \"redhat-marketplace-bsq7p\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " pod="openshift-marketplace/redhat-marketplace-bsq7p"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.806283 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-client-ca\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.806317 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-config\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.806335 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-serving-cert\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.806359 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8bpp\" (UniqueName: \"kubernetes.io/projected/8e249463-f3e7-4aed-a0ac-97c54af87949-kube-api-access-g8bpp\") pod \"redhat-marketplace-bsq7p\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " pod="openshift-marketplace/redhat-marketplace-bsq7p"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.807062 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-utilities\") pod \"redhat-marketplace-bsq7p\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " pod="openshift-marketplace/redhat-marketplace-bsq7p"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.807093 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-config\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.807110 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crg2s\" (UniqueName: \"kubernetes.io/projected/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-kube-api-access-crg2s\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.808075 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-client-ca\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.808916 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-proxy-ca-bundles\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.810465 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-config\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.811232 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-client-ca\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.815009 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-config\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.816846 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-serving-cert\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.821378 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-serving-cert\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.823104 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tlkp2"]
Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.827465 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rpww\" (UniqueName: \"kubernetes.io/projected/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-kube-api-access-5rpww\") pod \"route-controller-manager-674f4f85f8-gzrft\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:24 crc kubenswrapper[4748]: 
I0320 10:38:24.833038 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crg2s\" (UniqueName: \"kubernetes.io/projected/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-kube-api-access-crg2s\") pod \"controller-manager-66957d87b7-q85bk\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" Mar 20 10:38:24 crc kubenswrapper[4748]: W0320 10:38:24.834215 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da7288f_ca47_4172_a3dd_80a79e803277.slice/crio-543bdf75f48d9514a0a08fe229aab3dc1bcc97e95fcbe956aea8900aef92ec9b WatchSource:0}: Error finding container 543bdf75f48d9514a0a08fe229aab3dc1bcc97e95fcbe956aea8900aef92ec9b: Status 404 returned error can't find the container with id 543bdf75f48d9514a0a08fe229aab3dc1bcc97e95fcbe956aea8900aef92ec9b Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.908171 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-catalog-content\") pod \"redhat-marketplace-bsq7p\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.908255 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8bpp\" (UniqueName: \"kubernetes.io/projected/8e249463-f3e7-4aed-a0ac-97c54af87949-kube-api-access-g8bpp\") pod \"redhat-marketplace-bsq7p\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.908296 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-utilities\") pod 
\"redhat-marketplace-bsq7p\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.909048 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-utilities\") pod \"redhat-marketplace-bsq7p\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.910355 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.910653 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-catalog-content\") pod \"redhat-marketplace-bsq7p\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.919440 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" Mar 20 10:38:24 crc kubenswrapper[4748]: I0320 10:38:24.946333 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8bpp\" (UniqueName: \"kubernetes.io/projected/8e249463-f3e7-4aed-a0ac-97c54af87949-kube-api-access-g8bpp\") pod \"redhat-marketplace-bsq7p\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.030168 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.095915 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z78kn"] Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.098446 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.101081 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z78kn"] Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.103862 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.179109 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kqb5"] Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.225246 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-catalog-content\") pod \"redhat-operators-z78kn\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.225293 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-utilities\") pod \"redhat-operators-z78kn\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.225316 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs4lq\" (UniqueName: 
\"kubernetes.io/projected/7e393e84-d0bb-4258-8eef-012c9269fc05-kube-api-access-cs4lq\") pod \"redhat-operators-z78kn\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.228042 4748 generic.go:334] "Generic (PLEG): container finished" podID="2dd2dd89-ba90-440f-abc8-74ab27d7db69" containerID="f26e13d2a4bc838ed9c49216a3f99f4fc8a056e05e61c73cdfe6e29da627a0df" exitCode=0 Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.228112 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" event={"ID":"2dd2dd89-ba90-440f-abc8-74ab27d7db69","Type":"ContainerDied","Data":"f26e13d2a4bc838ed9c49216a3f99f4fc8a056e05e61c73cdfe6e29da627a0df"} Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.233713 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" event={"ID":"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16","Type":"ContainerStarted","Data":"bc556b5ac9d62e063f6a43eb0a10c98ef895ca553b704acc211a07f15423009f"} Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.233787 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5jzd5" event={"ID":"d7a2dfc5-1dd9-4ef1-9419-39f60da74b16","Type":"ContainerStarted","Data":"05b6a048a1d1e2b995681c5a00aafc74617d793163b8944eaa52390cc2b63727"} Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.238341 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bb9bf98a-c134-47ae-9ab8-d287fc56d70c","Type":"ContainerStarted","Data":"8b6b2976f08bdefa6794f14e5b7417940631b13727f6b5859b930b4b3a5a13bc"} Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.238396 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"bb9bf98a-c134-47ae-9ab8-d287fc56d70c","Type":"ContainerStarted","Data":"af768150a772b34720cd48b64d855c5c61f6a59b029af0be0ccd3c6241fcc8fb"} Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.277891 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5jzd5" podStartSLOduration=81.277865448 podStartE2EDuration="1m21.277865448s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:25.269559019 +0000 UTC m=+140.411104833" watchObservedRunningTime="2026-03-20 10:38:25.277865448 +0000 UTC m=+140.419411262" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.280827 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" event={"ID":"7da7288f-ca47-4172-a3dd-80a79e803277","Type":"ContainerStarted","Data":"ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090"} Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.281082 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" event={"ID":"7da7288f-ca47-4172-a3dd-80a79e803277","Type":"ContainerStarted","Data":"543bdf75f48d9514a0a08fe229aab3dc1bcc97e95fcbe956aea8900aef92ec9b"} Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.282266 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.290154 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.290125901 podStartE2EDuration="2.290125901s" podCreationTimestamp="2026-03-20 10:38:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-20 10:38:25.288316073 +0000 UTC m=+140.429861877" watchObservedRunningTime="2026-03-20 10:38:25.290125901 +0000 UTC m=+140.431671715" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.312772 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c4jdq"] Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.314159 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.326125 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-catalog-content\") pod \"redhat-operators-z78kn\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.326216 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-utilities\") pod \"redhat-operators-z78kn\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.326248 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs4lq\" (UniqueName: \"kubernetes.io/projected/7e393e84-d0bb-4258-8eef-012c9269fc05-kube-api-access-cs4lq\") pod \"redhat-operators-z78kn\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.327778 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-catalog-content\") pod \"redhat-operators-z78kn\" (UID: 
\"7e393e84-d0bb-4258-8eef-012c9269fc05\") " pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.328359 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" podStartSLOduration=81.328339059 podStartE2EDuration="1m21.328339059s" podCreationTimestamp="2026-03-20 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:25.319568358 +0000 UTC m=+140.461114192" watchObservedRunningTime="2026-03-20 10:38:25.328339059 +0000 UTC m=+140.469884873" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.328684 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-utilities\") pod \"redhat-operators-z78kn\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.334338 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4jdq"] Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.371229 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs4lq\" (UniqueName: \"kubernetes.io/projected/7e393e84-d0bb-4258-8eef-012c9269fc05-kube-api-access-cs4lq\") pod \"redhat-operators-z78kn\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.427672 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsdvt\" (UniqueName: \"kubernetes.io/projected/a82b758c-eb62-429b-b092-5ba4a7cf665d-kube-api-access-jsdvt\") pod \"redhat-operators-c4jdq\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " 
pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.427729 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-catalog-content\") pod \"redhat-operators-c4jdq\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.427852 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-utilities\") pod \"redhat-operators-c4jdq\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.435758 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.445815 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66957d87b7-q85bk"] Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.460536 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-9x2kc" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.482194 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.482245 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.497109 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-apiserver/apiserver-76f77b778f-qq9xx" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.506111 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.529475 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsdvt\" (UniqueName: \"kubernetes.io/projected/a82b758c-eb62-429b-b092-5ba4a7cf665d-kube-api-access-jsdvt\") pod \"redhat-operators-c4jdq\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.529536 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-catalog-content\") pod \"redhat-operators-c4jdq\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.529669 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-utilities\") pod \"redhat-operators-c4jdq\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.530073 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-utilities\") pod \"redhat-operators-c4jdq\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.531641 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-catalog-content\") pod \"redhat-operators-c4jdq\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.560576 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsdvt\" (UniqueName: \"kubernetes.io/projected/a82b758c-eb62-429b-b092-5ba4a7cf665d-kube-api-access-jsdvt\") pod \"redhat-operators-c4jdq\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.563868 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:38:25 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld Mar 20 10:38:25 crc kubenswrapper[4748]: [+]process-running ok Mar 20 10:38:25 crc kubenswrapper[4748]: healthz check failed Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.564292 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.564661 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.565193 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a49c3549-30d5-4927-ae41-cc2b9e7f47e2" path="/var/lib/kubelet/pods/a49c3549-30d5-4927-ae41-cc2b9e7f47e2/volumes" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 
10:38:25.565816 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa" path="/var/lib/kubelet/pods/ffd1ef63-d06a-4c98-8644-9efc1e7ae8fa/volumes" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.566797 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2xltn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.566826 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.595335 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"] Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.614226 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsq7p"] Mar 20 10:38:25 crc kubenswrapper[4748]: W0320 10:38:25.632008 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e249463_f3e7_4aed_a0ac_97c54af87949.slice/crio-56dee9e757cfe3109602a31f1bd0ccad534b7f0580925c8e5b8035ac2ef22672 WatchSource:0}: Error finding container 56dee9e757cfe3109602a31f1bd0ccad534b7f0580925c8e5b8035ac2ef22672: Status 404 returned error can't find the container with id 56dee9e757cfe3109602a31f1bd0ccad534b7f0580925c8e5b8035ac2ef22672 Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.693400 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.700406 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.701060 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.705332 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.705404 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.707860 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.734439 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f802f60b-ea00-4011-ab27-942219aaddd5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f802f60b-ea00-4011-ab27-942219aaddd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.734511 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f802f60b-ea00-4011-ab27-942219aaddd5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f802f60b-ea00-4011-ab27-942219aaddd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.771701 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hsr94" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.835715 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f802f60b-ea00-4011-ab27-942219aaddd5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"f802f60b-ea00-4011-ab27-942219aaddd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.835788 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f802f60b-ea00-4011-ab27-942219aaddd5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f802f60b-ea00-4011-ab27-942219aaddd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.836441 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f802f60b-ea00-4011-ab27-942219aaddd5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f802f60b-ea00-4011-ab27-942219aaddd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.859392 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f802f60b-ea00-4011-ab27-942219aaddd5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f802f60b-ea00-4011-ab27-942219aaddd5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.938846 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hxtfq" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.942561 4748 patch_prober.go:28] interesting pod/downloads-7954f5f757-hxtfq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.942606 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hxtfq" 
podUID="0a5bf285-90ef-47c5-a959-7bae63c410a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.942657 4748 patch_prober.go:28] interesting pod/downloads-7954f5f757-hxtfq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.942712 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hxtfq" podUID="0a5bf285-90ef-47c5-a959-7bae63c410a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.943216 4748 patch_prober.go:28] interesting pod/downloads-7954f5f757-hxtfq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.943276 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hxtfq" podUID="0a5bf285-90ef-47c5-a959-7bae63c410a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.951713 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 10:38:25 crc kubenswrapper[4748]: I0320 10:38:25.960848 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 
20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.025746 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:38:26 crc kubenswrapper[4748]: E0320 10:38:26.056485 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e249463_f3e7_4aed_a0ac_97c54af87949.slice/crio-conmon-564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.079746 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z78kn"]
Mar 20 10:38:26 crc kubenswrapper[4748]: W0320 10:38:26.101345 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e393e84_d0bb_4258_8eef_012c9269fc05.slice/crio-3068537172e1764ef8b99022b1a72fa9f1251f541d313d66e9d4a5e0bc488d36 WatchSource:0}: Error finding container 3068537172e1764ef8b99022b1a72fa9f1251f541d313d66e9d4a5e0bc488d36: Status 404 returned error can't find the container with id 3068537172e1764ef8b99022b1a72fa9f1251f541d313d66e9d4a5e0bc488d36
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.148599 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4jdq"]
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.280647 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.280700 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dmmdh"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.289066 4748 patch_prober.go:28] interesting pod/console-f9d7485db-dmmdh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.289121 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dmmdh" podUID="ae334fdf-f952-4b6b-8372-1fd7ef332362" containerName="console" probeResult="failure" output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.296689 4748 generic.go:334] "Generic (PLEG): container finished" podID="bb9bf98a-c134-47ae-9ab8-d287fc56d70c" containerID="8b6b2976f08bdefa6794f14e5b7417940631b13727f6b5859b930b4b3a5a13bc" exitCode=0
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.296769 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bb9bf98a-c134-47ae-9ab8-d287fc56d70c","Type":"ContainerDied","Data":"8b6b2976f08bdefa6794f14e5b7417940631b13727f6b5859b930b4b3a5a13bc"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.322165 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4jdq" event={"ID":"a82b758c-eb62-429b-b092-5ba4a7cf665d","Type":"ContainerStarted","Data":"b7cb07f5cac3ad57926072d83de81838ad26e5baadc4b9e9a903bbe72a1b29e2"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.336127 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" event={"ID":"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f","Type":"ContainerStarted","Data":"d944b9f004ae9d053dba65cf05ec5acfa9768648878718716f55ee581a98cc26"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.336162 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" event={"ID":"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f","Type":"ContainerStarted","Data":"d06c41f4353d612fa95fd785156b763fe14996c95bfa7a5afb5e37f1a65433b2"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.337075 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.341709 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.375273 4748 generic.go:334] "Generic (PLEG): container finished" podID="0186ffa9-907a-4afd-953d-28665f7343da" containerID="2bc8b876dfde81df471e95d41f7ca032bd935a8c65b856c94c61c8fe3a3de955" exitCode=0
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.375626 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kqb5" event={"ID":"0186ffa9-907a-4afd-953d-28665f7343da","Type":"ContainerDied","Data":"2bc8b876dfde81df471e95d41f7ca032bd935a8c65b856c94c61c8fe3a3de955"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.375658 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kqb5" event={"ID":"0186ffa9-907a-4afd-953d-28665f7343da","Type":"ContainerStarted","Data":"6e814cddf9ef53a75c5c8eb21e64afcdd5f6e0ac31226ccf981a0ed4afc41628"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.385284 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" podStartSLOduration=3.385261836 podStartE2EDuration="3.385261836s" podCreationTimestamp="2026-03-20 10:38:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:26.37027588 +0000 UTC m=+141.511821694" watchObservedRunningTime="2026-03-20 10:38:26.385261836 +0000 UTC m=+141.526807650"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.385983 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerID="564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02" exitCode=0
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.386051 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsq7p" event={"ID":"8e249463-f3e7-4aed-a0ac-97c54af87949","Type":"ContainerDied","Data":"564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.386076 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsq7p" event={"ID":"8e249463-f3e7-4aed-a0ac-97c54af87949","Type":"ContainerStarted","Data":"56dee9e757cfe3109602a31f1bd0ccad534b7f0580925c8e5b8035ac2ef22672"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.390201 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" event={"ID":"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd","Type":"ContainerStarted","Data":"b6506986e3b63742d73a261f5921d54f77dc042470b4819486a8375518450914"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.390245 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" event={"ID":"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd","Type":"ContainerStarted","Data":"a16c9bb05f75e4e69e25f14e4da380dba13e34a72e86c0fb9836491a66aa0d07"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.393819 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.425202 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 10:38:26 crc kubenswrapper[4748]: W0320 10:38:26.438845 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf802f60b_ea00_4011_ab27_942219aaddd5.slice/crio-e1b946100bb6a4b6f92f89911da93649964929bdadbd17396c4a2856dcc2e645 WatchSource:0}: Error finding container e1b946100bb6a4b6f92f89911da93649964929bdadbd17396c4a2856dcc2e645: Status 404 returned error can't find the container with id e1b946100bb6a4b6f92f89911da93649964929bdadbd17396c4a2856dcc2e645
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.439075 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.447093 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.452099 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z78kn" event={"ID":"7e393e84-d0bb-4258-8eef-012c9269fc05","Type":"ContainerStarted","Data":"3068537172e1764ef8b99022b1a72fa9f1251f541d313d66e9d4a5e0bc488d36"}
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.460752 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qq9xx"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.468909 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" podStartSLOduration=3.468885892 podStartE2EDuration="3.468885892s" podCreationTimestamp="2026-03-20 10:38:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:26.462464563 +0000 UTC m=+141.604010377" watchObservedRunningTime="2026-03-20 10:38:26.468885892 +0000 UTC m=+141.610431706"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.476873 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.561202 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:26 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:26 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:26 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.561257 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.641920 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.673524 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8xtg5"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.723121 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7"
Mar 20 10:38:26 crc kubenswrapper[4748]: I0320 10:38:26.775088 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7"
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.117605 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7"
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.189518 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dd2dd89-ba90-440f-abc8-74ab27d7db69-secret-volume\") pod \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") "
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.189899 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dd2dd89-ba90-440f-abc8-74ab27d7db69-config-volume\") pod \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") "
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.189977 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxwbs\" (UniqueName: \"kubernetes.io/projected/2dd2dd89-ba90-440f-abc8-74ab27d7db69-kube-api-access-sxwbs\") pod \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\" (UID: \"2dd2dd89-ba90-440f-abc8-74ab27d7db69\") "
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.191682 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd2dd89-ba90-440f-abc8-74ab27d7db69-config-volume" (OuterVolumeSpecName: "config-volume") pod "2dd2dd89-ba90-440f-abc8-74ab27d7db69" (UID: "2dd2dd89-ba90-440f-abc8-74ab27d7db69"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.202793 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dd2dd89-ba90-440f-abc8-74ab27d7db69-kube-api-access-sxwbs" (OuterVolumeSpecName: "kube-api-access-sxwbs") pod "2dd2dd89-ba90-440f-abc8-74ab27d7db69" (UID: "2dd2dd89-ba90-440f-abc8-74ab27d7db69"). InnerVolumeSpecName "kube-api-access-sxwbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.203354 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd2dd89-ba90-440f-abc8-74ab27d7db69-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2dd2dd89-ba90-440f-abc8-74ab27d7db69" (UID: "2dd2dd89-ba90-440f-abc8-74ab27d7db69"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.290446 4748 ???:1] "http: TLS handshake error from 192.168.126.11:42650: no serving certificate available for the kubelet"
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.291197 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxwbs\" (UniqueName: \"kubernetes.io/projected/2dd2dd89-ba90-440f-abc8-74ab27d7db69-kube-api-access-sxwbs\") on node \"crc\" DevicePath \"\""
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.291223 4748 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dd2dd89-ba90-440f-abc8-74ab27d7db69-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.291237 4748 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dd2dd89-ba90-440f-abc8-74ab27d7db69-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.327108 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-9cpg7"]
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.507469 4748 generic.go:334] "Generic (PLEG): container finished" podID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerID="f656da61255b854e0469306f315d81a1fccfdca6dc84f440c4548beaf65bfa33" exitCode=0
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.508282 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z78kn" event={"ID":"7e393e84-d0bb-4258-8eef-012c9269fc05","Type":"ContainerDied","Data":"f656da61255b854e0469306f315d81a1fccfdca6dc84f440c4548beaf65bfa33"}
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.553282 4748 generic.go:334] "Generic (PLEG): container finished" podID="a82b758c-eb62-429b-b092-5ba4a7cf665d" containerID="bfc6ee5bd4710e309e9817d2fe156e9ed9ab83c170378c330999947644108fa4" exitCode=0
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.557558 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:27 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:27 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:27 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.557605 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.561397 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f802f60b-ea00-4011-ab27-942219aaddd5","Type":"ContainerStarted","Data":"ad91997801e8dbf9198e06a87a4f6fae73f7b6376236d49528f8c1a4cfab4bef"}
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.561459 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f802f60b-ea00-4011-ab27-942219aaddd5","Type":"ContainerStarted","Data":"e1b946100bb6a4b6f92f89911da93649964929bdadbd17396c4a2856dcc2e645"}
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.561471 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4jdq" event={"ID":"a82b758c-eb62-429b-b092-5ba4a7cf665d","Type":"ContainerDied","Data":"bfc6ee5bd4710e309e9817d2fe156e9ed9ab83c170378c330999947644108fa4"}
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.572551 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.572535131 podStartE2EDuration="2.572535131s" podCreationTimestamp="2026-03-20 10:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:27.570707963 +0000 UTC m=+142.712253777" watchObservedRunningTime="2026-03-20 10:38:27.572535131 +0000 UTC m=+142.714080945"
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.573754 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7"
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.575974 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7" event={"ID":"2dd2dd89-ba90-440f-abc8-74ab27d7db69","Type":"ContainerDied","Data":"6289653e02f4403943a0bee471c80acadad5ccb5e9266bcde2d99f64dea0c33d"}
Mar 20 10:38:27 crc kubenswrapper[4748]: I0320 10:38:27.576015 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6289653e02f4403943a0bee471c80acadad5ccb5e9266bcde2d99f64dea0c33d"
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.062410 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.114015 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kubelet-dir\") pod \"bb9bf98a-c134-47ae-9ab8-d287fc56d70c\" (UID: \"bb9bf98a-c134-47ae-9ab8-d287fc56d70c\") "
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.114130 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bb9bf98a-c134-47ae-9ab8-d287fc56d70c" (UID: "bb9bf98a-c134-47ae-9ab8-d287fc56d70c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.114214 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kube-api-access\") pod \"bb9bf98a-c134-47ae-9ab8-d287fc56d70c\" (UID: \"bb9bf98a-c134-47ae-9ab8-d287fc56d70c\") "
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.114487 4748 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.125408 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bb9bf98a-c134-47ae-9ab8-d287fc56d70c" (UID: "bb9bf98a-c134-47ae-9ab8-d287fc56d70c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.216440 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb9bf98a-c134-47ae-9ab8-d287fc56d70c-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.558037 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:28 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:28 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:28 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.558145 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.600122 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.600495 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bb9bf98a-c134-47ae-9ab8-d287fc56d70c","Type":"ContainerDied","Data":"af768150a772b34720cd48b64d855c5c61f6a59b029af0be0ccd3c6241fcc8fb"}
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.600591 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af768150a772b34720cd48b64d855c5c61f6a59b029af0be0ccd3c6241fcc8fb"
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.607155 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-26svx_6c6c249f-695d-4875-94ad-a608e8bd7d5f/cluster-samples-operator/0.log"
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.607248 4748 generic.go:334] "Generic (PLEG): container finished" podID="6c6c249f-695d-4875-94ad-a608e8bd7d5f" containerID="12a2bbf0979b12d10c351de55ef1c76b2607e5a8e2db34c4858cb22fa8510416" exitCode=2
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.607311 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" event={"ID":"6c6c249f-695d-4875-94ad-a608e8bd7d5f","Type":"ContainerDied","Data":"12a2bbf0979b12d10c351de55ef1c76b2607e5a8e2db34c4858cb22fa8510416"}
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.608245 4748 scope.go:117] "RemoveContainer" containerID="12a2bbf0979b12d10c351de55ef1c76b2607e5a8e2db34c4858cb22fa8510416"
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.614708 4748 generic.go:334] "Generic (PLEG): container finished" podID="f802f60b-ea00-4011-ab27-942219aaddd5" containerID="ad91997801e8dbf9198e06a87a4f6fae73f7b6376236d49528f8c1a4cfab4bef" exitCode=0
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.614973 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f802f60b-ea00-4011-ab27-942219aaddd5","Type":"ContainerDied","Data":"ad91997801e8dbf9198e06a87a4f6fae73f7b6376236d49528f8c1a4cfab4bef"}
Mar 20 10:38:28 crc kubenswrapper[4748]: I0320 10:38:28.615476 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" podUID="c0116c88-07ce-460e-90ba-2b5d8fd6a921" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" gracePeriod=30
Mar 20 10:38:29 crc kubenswrapper[4748]: I0320 10:38:29.556973 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:29 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:29 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:29 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:29 crc kubenswrapper[4748]: I0320 10:38:29.557374 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:29 crc kubenswrapper[4748]: I0320 10:38:29.662868 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-26svx_6c6c249f-695d-4875-94ad-a608e8bd7d5f/cluster-samples-operator/0.log"
Mar 20 10:38:29 crc kubenswrapper[4748]: I0320 10:38:29.663130 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-26svx" event={"ID":"6c6c249f-695d-4875-94ad-a608e8bd7d5f","Type":"ContainerStarted","Data":"9b1432eb0c84f8c7c98d43f41bbbaaba2aa0326b3ebbba8fe886a4821373e635"}
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.019009 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.151507 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f802f60b-ea00-4011-ab27-942219aaddd5-kubelet-dir\") pod \"f802f60b-ea00-4011-ab27-942219aaddd5\" (UID: \"f802f60b-ea00-4011-ab27-942219aaddd5\") "
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.152049 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f802f60b-ea00-4011-ab27-942219aaddd5-kube-api-access\") pod \"f802f60b-ea00-4011-ab27-942219aaddd5\" (UID: \"f802f60b-ea00-4011-ab27-942219aaddd5\") "
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.151656 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f802f60b-ea00-4011-ab27-942219aaddd5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f802f60b-ea00-4011-ab27-942219aaddd5" (UID: "f802f60b-ea00-4011-ab27-942219aaddd5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.170311 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f802f60b-ea00-4011-ab27-942219aaddd5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f802f60b-ea00-4011-ab27-942219aaddd5" (UID: "f802f60b-ea00-4011-ab27-942219aaddd5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.254214 4748 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f802f60b-ea00-4011-ab27-942219aaddd5-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.254248 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f802f60b-ea00-4011-ab27-942219aaddd5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.580626 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:30 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:30 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:30 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.580693 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.680007 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f802f60b-ea00-4011-ab27-942219aaddd5","Type":"ContainerDied","Data":"e1b946100bb6a4b6f92f89911da93649964929bdadbd17396c4a2856dcc2e645"}
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.680049 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b946100bb6a4b6f92f89911da93649964929bdadbd17396c4a2856dcc2e645"
Mar 20 10:38:30 crc kubenswrapper[4748]: I0320 10:38:30.680068 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:38:31 crc kubenswrapper[4748]: I0320 10:38:31.556582 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:31 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:31 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:31 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:31 crc kubenswrapper[4748]: I0320 10:38:31.556639 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:31 crc kubenswrapper[4748]: I0320 10:38:31.691605 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vn8zj"
Mar 20 10:38:32 crc kubenswrapper[4748]: I0320 10:38:32.245466 4748 ???:1] "http: TLS handshake error from 192.168.126.11:42656: no serving certificate available for the kubelet"
Mar 20 10:38:32 crc kubenswrapper[4748]: I0320 10:38:32.445206 4748 ???:1] "http: TLS handshake error from 192.168.126.11:42660: no serving certificate available for the kubelet"
Mar 20 10:38:32 crc kubenswrapper[4748]: I0320 10:38:32.556505 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:32 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:32 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:32 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:32 crc kubenswrapper[4748]: I0320 10:38:32.556589 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:33 crc kubenswrapper[4748]: I0320 10:38:33.557473 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:33 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:33 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:33 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:33 crc kubenswrapper[4748]: I0320 10:38:33.557531 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:34 crc kubenswrapper[4748]: I0320 10:38:34.556816 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:34 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:34 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:34 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:34 crc kubenswrapper[4748]: I0320 10:38:34.557147 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:35 crc kubenswrapper[4748]: I0320 10:38:35.556084 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:35 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:35 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:35 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:35 crc kubenswrapper[4748]: I0320 10:38:35.556167 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:35 crc kubenswrapper[4748]: I0320 10:38:35.938504 4748 patch_prober.go:28] interesting pod/downloads-7954f5f757-hxtfq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body=
Mar 20 10:38:35 crc kubenswrapper[4748]: I0320 10:38:35.938589 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hxtfq" podUID="0a5bf285-90ef-47c5-a959-7bae63c410a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Mar 20 10:38:35 crc kubenswrapper[4748]: I0320 10:38:35.938504 4748 patch_prober.go:28] interesting pod/downloads-7954f5f757-hxtfq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body=
Mar 20 10:38:35 crc kubenswrapper[4748]: I0320 10:38:35.938660 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hxtfq" podUID="0a5bf285-90ef-47c5-a959-7bae63c410a5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Mar 20 10:38:36 crc kubenswrapper[4748]: I0320 10:38:36.281423 4748 patch_prober.go:28] interesting pod/console-f9d7485db-dmmdh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Mar 20 10:38:36 crc kubenswrapper[4748]: I0320 10:38:36.281746 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dmmdh" podUID="ae334fdf-f952-4b6b-8372-1fd7ef332362" containerName="console" probeResult="failure" output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused"
Mar 20 10:38:36 crc kubenswrapper[4748]: I0320 10:38:36.556726 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:38:36 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld
Mar 20 10:38:36 crc kubenswrapper[4748]: [+]process-running ok
Mar 20 10:38:36 crc kubenswrapper[4748]: healthz check failed
Mar 20 10:38:36 crc kubenswrapper[4748]: I0320 10:38:36.556784 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:38:36 crc kubenswrapper[4748]: E0320 10:38:36.727700 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot
register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:38:36 crc kubenswrapper[4748]: E0320 10:38:36.731069 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:38:36 crc kubenswrapper[4748]: E0320 10:38:36.735270 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:38:36 crc kubenswrapper[4748]: E0320 10:38:36.735307 4748 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" podUID="c0116c88-07ce-460e-90ba-2b5d8fd6a921" containerName="kube-multus-additional-cni-plugins" Mar 20 10:38:37 crc kubenswrapper[4748]: I0320 10:38:37.560377 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:38:37 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld Mar 20 10:38:37 crc kubenswrapper[4748]: [+]process-running ok Mar 20 10:38:37 crc kubenswrapper[4748]: healthz check failed Mar 20 10:38:37 crc kubenswrapper[4748]: I0320 10:38:37.560463 4748 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:38:38 crc kubenswrapper[4748]: I0320 10:38:38.735185 4748 patch_prober.go:28] interesting pod/router-default-5444994796-rbx59 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:38:38 crc kubenswrapper[4748]: [-]has-synced failed: reason withheld Mar 20 10:38:38 crc kubenswrapper[4748]: [+]process-running ok Mar 20 10:38:38 crc kubenswrapper[4748]: healthz check failed Mar 20 10:38:38 crc kubenswrapper[4748]: I0320 10:38:38.735244 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbx59" podUID="2c8e6760-4ff7-4417-917e-61bdc115d710" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:38:39 crc kubenswrapper[4748]: I0320 10:38:39.556500 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:39 crc kubenswrapper[4748]: I0320 10:38:39.558795 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-rbx59" Mar 20 10:38:42 crc kubenswrapper[4748]: I0320 10:38:42.071396 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66957d87b7-q85bk"] Mar 20 10:38:42 crc kubenswrapper[4748]: I0320 10:38:42.072042 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" podUID="e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" containerName="controller-manager" 
containerID="cri-o://d944b9f004ae9d053dba65cf05ec5acfa9768648878718716f55ee581a98cc26" gracePeriod=30 Mar 20 10:38:42 crc kubenswrapper[4748]: I0320 10:38:42.093904 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"] Mar 20 10:38:42 crc kubenswrapper[4748]: I0320 10:38:42.094271 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" podUID="a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" containerName="route-controller-manager" containerID="cri-o://b6506986e3b63742d73a261f5921d54f77dc042470b4819486a8375518450914" gracePeriod=30 Mar 20 10:38:44 crc kubenswrapper[4748]: I0320 10:38:44.555438 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:38:44 crc kubenswrapper[4748]: I0320 10:38:44.782445 4748 generic.go:334] "Generic (PLEG): container finished" podID="e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" containerID="d944b9f004ae9d053dba65cf05ec5acfa9768648878718716f55ee581a98cc26" exitCode=0 Mar 20 10:38:44 crc kubenswrapper[4748]: I0320 10:38:44.782562 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" event={"ID":"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f","Type":"ContainerDied","Data":"d944b9f004ae9d053dba65cf05ec5acfa9768648878718716f55ee581a98cc26"} Mar 20 10:38:44 crc kubenswrapper[4748]: I0320 10:38:44.911801 4748 patch_prober.go:28] interesting pod/controller-manager-66957d87b7-q85bk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 20 10:38:44 crc kubenswrapper[4748]: I0320 10:38:44.911916 4748 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" podUID="e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 20 10:38:44 crc kubenswrapper[4748]: I0320 10:38:44.920319 4748 patch_prober.go:28] interesting pod/route-controller-manager-674f4f85f8-gzrft container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 20 10:38:44 crc kubenswrapper[4748]: I0320 10:38:44.920388 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" podUID="a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 20 10:38:45 crc kubenswrapper[4748]: I0320 10:38:45.943359 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hxtfq" Mar 20 10:38:46 crc kubenswrapper[4748]: I0320 10:38:46.319021 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:46 crc kubenswrapper[4748]: I0320 10:38:46.325136 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:38:46 crc kubenswrapper[4748]: E0320 10:38:46.727276 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" 
cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:38:46 crc kubenswrapper[4748]: E0320 10:38:46.729774 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:38:46 crc kubenswrapper[4748]: E0320 10:38:46.731822 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:38:46 crc kubenswrapper[4748]: E0320 10:38:46.731973 4748 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" podUID="c0116c88-07ce-460e-90ba-2b5d8fd6a921" containerName="kube-multus-additional-cni-plugins" Mar 20 10:38:50 crc kubenswrapper[4748]: I0320 10:38:50.542139 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 10:38:52 crc kubenswrapper[4748]: I0320 10:38:52.949826 4748 ???:1] "http: TLS handshake error from 192.168.126.11:35470: no serving certificate available for the kubelet" Mar 20 10:38:54 crc kubenswrapper[4748]: I0320 10:38:54.708096 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:38:54 crc kubenswrapper[4748]: I0320 10:38:54.757361 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
podStartSLOduration=4.757339064 podStartE2EDuration="4.757339064s" podCreationTimestamp="2026-03-20 10:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:38:54.754422127 +0000 UTC m=+169.895967971" watchObservedRunningTime="2026-03-20 10:38:54.757339064 +0000 UTC m=+169.898884898" Mar 20 10:38:54 crc kubenswrapper[4748]: I0320 10:38:54.844511 4748 generic.go:334] "Generic (PLEG): container finished" podID="a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" containerID="b6506986e3b63742d73a261f5921d54f77dc042470b4819486a8375518450914" exitCode=0 Mar 20 10:38:54 crc kubenswrapper[4748]: I0320 10:38:54.844557 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" event={"ID":"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd","Type":"ContainerDied","Data":"b6506986e3b63742d73a261f5921d54f77dc042470b4819486a8375518450914"} Mar 20 10:38:54 crc kubenswrapper[4748]: I0320 10:38:54.921163 4748 patch_prober.go:28] interesting pod/route-controller-manager-674f4f85f8-gzrft container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 20 10:38:54 crc kubenswrapper[4748]: I0320 10:38:54.921242 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" podUID="a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 20 10:38:55 crc kubenswrapper[4748]: I0320 10:38:55.911675 4748 patch_prober.go:28] interesting pod/controller-manager-66957d87b7-q85bk container/controller-manager namespace/openshift-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:38:55 crc kubenswrapper[4748]: I0320 10:38:55.912128 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" podUID="e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:38:56 crc kubenswrapper[4748]: I0320 10:38:56.644980 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hn5gq" Mar 20 10:38:56 crc kubenswrapper[4748]: E0320 10:38:56.724781 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:38:56 crc kubenswrapper[4748]: E0320 10:38:56.726737 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:38:56 crc kubenswrapper[4748]: E0320 10:38:56.728211 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" 
cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:38:56 crc kubenswrapper[4748]: E0320 10:38:56.728249 4748 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" podUID="c0116c88-07ce-460e-90ba-2b5d8fd6a921" containerName="kube-multus-additional-cni-plugins" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.367420 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:38:59 crc kubenswrapper[4748]: E0320 10:38:59.368155 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f802f60b-ea00-4011-ab27-942219aaddd5" containerName="pruner" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.368177 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f802f60b-ea00-4011-ab27-942219aaddd5" containerName="pruner" Mar 20 10:38:59 crc kubenswrapper[4748]: E0320 10:38:59.368204 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9bf98a-c134-47ae-9ab8-d287fc56d70c" containerName="pruner" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.368216 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9bf98a-c134-47ae-9ab8-d287fc56d70c" containerName="pruner" Mar 20 10:38:59 crc kubenswrapper[4748]: E0320 10:38:59.368302 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd2dd89-ba90-440f-abc8-74ab27d7db69" containerName="collect-profiles" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.368317 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd2dd89-ba90-440f-abc8-74ab27d7db69" containerName="collect-profiles" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.369383 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb9bf98a-c134-47ae-9ab8-d287fc56d70c" containerName="pruner" Mar 20 10:38:59 crc 
kubenswrapper[4748]: I0320 10:38:59.369428 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dd2dd89-ba90-440f-abc8-74ab27d7db69" containerName="collect-profiles" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.369445 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f802f60b-ea00-4011-ab27-942219aaddd5" containerName="pruner" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.372984 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.375016 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.384280 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.384426 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.504266 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c183842b-4ff7-4ea0-b4fb-a1641c33efd1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.504547 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c183842b-4ff7-4ea0-b4fb-a1641c33efd1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.605688 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c183842b-4ff7-4ea0-b4fb-a1641c33efd1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.605826 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c183842b-4ff7-4ea0-b4fb-a1641c33efd1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.606195 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c183842b-4ff7-4ea0-b4fb-a1641c33efd1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.627013 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c183842b-4ff7-4ea0-b4fb-a1641c33efd1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.701151 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.876291 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-9cpg7_c0116c88-07ce-460e-90ba-2b5d8fd6a921/kube-multus-additional-cni-plugins/0.log" Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.876350 4748 generic.go:334] "Generic (PLEG): container finished" podID="c0116c88-07ce-460e-90ba-2b5d8fd6a921" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" exitCode=137 Mar 20 10:38:59 crc kubenswrapper[4748]: I0320 10:38:59.876389 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" event={"ID":"c0116c88-07ce-460e-90ba-2b5d8fd6a921","Type":"ContainerDied","Data":"74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60"} Mar 20 10:39:01 crc kubenswrapper[4748]: E0320 10:39:01.413914 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 10:39:01 crc kubenswrapper[4748]: E0320 10:39:01.414328 4748 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:39:01 crc kubenswrapper[4748]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 10:39:01 crc kubenswrapper[4748]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpwpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566718-7hmv9_openshift-infra(0de9aa72-edab-4ae9-b2dd-e20ef6b83277): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 10:39:01 crc kubenswrapper[4748]: > logger="UnhandledError" Mar 20 10:39:01 crc kubenswrapper[4748]: E0320 10:39:01.415519 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566718-7hmv9" podUID="0de9aa72-edab-4ae9-b2dd-e20ef6b83277" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.442338 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.447081 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.470101 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44"] Mar 20 10:39:01 crc kubenswrapper[4748]: E0320 10:39:01.470396 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" containerName="route-controller-manager" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.470413 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" containerName="route-controller-manager" Mar 20 10:39:01 crc kubenswrapper[4748]: E0320 10:39:01.470435 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" containerName="controller-manager" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.470444 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" containerName="controller-manager" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.470603 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" containerName="controller-manager" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.470617 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" containerName="route-controller-manager" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.471132 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.475668 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44"] Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.531410 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rpww\" (UniqueName: \"kubernetes.io/projected/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-kube-api-access-5rpww\") pod \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.531504 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-serving-cert\") pod \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.531548 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crg2s\" (UniqueName: \"kubernetes.io/projected/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-kube-api-access-crg2s\") pod \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.531569 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-config\") pod \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.531642 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-client-ca\") pod \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\" 
(UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.531662 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-proxy-ca-bundles\") pod \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.531713 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-config\") pod \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.531739 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-serving-cert\") pod \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\" (UID: \"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f\") " Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.531776 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-client-ca\") pod \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\" (UID: \"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd\") " Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.532702 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-client-ca" (OuterVolumeSpecName: "client-ca") pod "a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" (UID: "a4f3d9af-dccd-49ed-b84f-1a5ee38037fd"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.532716 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" (UID: "e9c71405-b1fd-41f5-8a75-23f81e2b5a5f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.532738 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-client-ca" (OuterVolumeSpecName: "client-ca") pod "e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" (UID: "e9c71405-b1fd-41f5-8a75-23f81e2b5a5f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.532779 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-config" (OuterVolumeSpecName: "config") pod "e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" (UID: "e9c71405-b1fd-41f5-8a75-23f81e2b5a5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.532919 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-config" (OuterVolumeSpecName: "config") pod "a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" (UID: "a4f3d9af-dccd-49ed-b84f-1a5ee38037fd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.538228 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" (UID: "a4f3d9af-dccd-49ed-b84f-1a5ee38037fd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.538244 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" (UID: "e9c71405-b1fd-41f5-8a75-23f81e2b5a5f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.538270 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-kube-api-access-5rpww" (OuterVolumeSpecName: "kube-api-access-5rpww") pod "a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" (UID: "a4f3d9af-dccd-49ed-b84f-1a5ee38037fd"). InnerVolumeSpecName "kube-api-access-5rpww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.538424 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-kube-api-access-crg2s" (OuterVolumeSpecName: "kube-api-access-crg2s") pod "e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" (UID: "e9c71405-b1fd-41f5-8a75-23f81e2b5a5f"). InnerVolumeSpecName "kube-api-access-crg2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.633569 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssq8b\" (UniqueName: \"kubernetes.io/projected/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-kube-api-access-ssq8b\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.633742 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-config\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.633772 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-proxy-ca-bundles\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.633799 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-serving-cert\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.633828 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-client-ca\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.633934 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.633954 4748 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.633967 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.633979 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.633991 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.634002 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rpww\" (UniqueName: \"kubernetes.io/projected/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-kube-api-access-5rpww\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.634012 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.634023 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crg2s\" (UniqueName: \"kubernetes.io/projected/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f-kube-api-access-crg2s\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.634036 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.734918 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-client-ca\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.734975 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssq8b\" (UniqueName: \"kubernetes.io/projected/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-kube-api-access-ssq8b\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.735788 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-config\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.736826 4748 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-proxy-ca-bundles\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.736955 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-config\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.736986 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-client-ca\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.736985 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-proxy-ca-bundles\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.737135 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-serving-cert\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: 
I0320 10:39:01.741142 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-serving-cert\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.751934 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssq8b\" (UniqueName: \"kubernetes.io/projected/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-kube-api-access-ssq8b\") pod \"controller-manager-9f7b5d9c8-lxv44\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.805791 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.888244 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" event={"ID":"a4f3d9af-dccd-49ed-b84f-1a5ee38037fd","Type":"ContainerDied","Data":"a16c9bb05f75e4e69e25f14e4da380dba13e34a72e86c0fb9836491a66aa0d07"} Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.888276 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.888295 4748 scope.go:117] "RemoveContainer" containerID="b6506986e3b63742d73a261f5921d54f77dc042470b4819486a8375518450914" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.890929 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" event={"ID":"e9c71405-b1fd-41f5-8a75-23f81e2b5a5f","Type":"ContainerDied","Data":"d06c41f4353d612fa95fd785156b763fe14996c95bfa7a5afb5e37f1a65433b2"} Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.890954 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66957d87b7-q85bk" Mar 20 10:39:01 crc kubenswrapper[4748]: E0320 10:39:01.900520 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566718-7hmv9" podUID="0de9aa72-edab-4ae9-b2dd-e20ef6b83277" Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.933293 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"] Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.937921 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674f4f85f8-gzrft"] Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.950513 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66957d87b7-q85bk"] Mar 20 10:39:01 crc kubenswrapper[4748]: I0320 10:39:01.957890 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66957d87b7-q85bk"] Mar 
20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.017266 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44"] Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.142483 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr"] Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.143202 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.145789 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.145997 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.146256 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.146340 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.147281 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.149024 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.160172 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr"] Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 
10:39:02.242131 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-client-ca\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.242202 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cwbv\" (UniqueName: \"kubernetes.io/projected/a9274aea-7ec5-410b-a183-f67a45ee5241-kube-api-access-5cwbv\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.242250 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9274aea-7ec5-410b-a183-f67a45ee5241-serving-cert\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.242295 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-config\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.343765 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-config\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.343872 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-client-ca\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.343893 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cwbv\" (UniqueName: \"kubernetes.io/projected/a9274aea-7ec5-410b-a183-f67a45ee5241-kube-api-access-5cwbv\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.343926 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9274aea-7ec5-410b-a183-f67a45ee5241-serving-cert\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.344932 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-client-ca\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 
10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.345109 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-config\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.347313 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9274aea-7ec5-410b-a183-f67a45ee5241-serving-cert\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.378645 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cwbv\" (UniqueName: \"kubernetes.io/projected/a9274aea-7ec5-410b-a183-f67a45ee5241-kube-api-access-5cwbv\") pod \"route-controller-manager-dfc8ff6c5-9t8fr\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:02 crc kubenswrapper[4748]: I0320 10:39:02.457315 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:03 crc kubenswrapper[4748]: E0320 10:39:03.129266 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 10:39:03 crc kubenswrapper[4748]: E0320 10:39:03.129790 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjvb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},Start
upProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mhpdf_openshift-marketplace(623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:39:03 crc kubenswrapper[4748]: E0320 10:39:03.130964 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mhpdf" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" Mar 20 10:39:03 crc kubenswrapper[4748]: I0320 10:39:03.522014 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f3d9af-dccd-49ed-b84f-1a5ee38037fd" path="/var/lib/kubelet/pods/a4f3d9af-dccd-49ed-b84f-1a5ee38037fd/volumes" Mar 20 10:39:03 crc kubenswrapper[4748]: I0320 10:39:03.522508 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c71405-b1fd-41f5-8a75-23f81e2b5a5f" path="/var/lib/kubelet/pods/e9c71405-b1fd-41f5-8a75-23f81e2b5a5f/volumes" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.563622 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.565199 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.568883 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.677984 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-var-lock\") pod \"installer-9-crc\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.678041 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kube-api-access\") pod \"installer-9-crc\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.678085 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.779097 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kube-api-access\") pod \"installer-9-crc\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.779396 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.779480 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-var-lock\") pod \"installer-9-crc\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.779497 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.779547 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-var-lock\") pod \"installer-9-crc\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.798152 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kube-api-access\") pod \"installer-9-crc\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:04 crc kubenswrapper[4748]: I0320 10:39:04.889057 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:06 crc kubenswrapper[4748]: E0320 10:39:06.487263 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mhpdf" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" Mar 20 10:39:06 crc kubenswrapper[4748]: E0320 10:39:06.593908 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 10:39:06 crc kubenswrapper[4748]: E0320 10:39:06.594112 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cs4lq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-z78kn_openshift-marketplace(7e393e84-d0bb-4258-8eef-012c9269fc05): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:39:06 crc kubenswrapper[4748]: E0320 10:39:06.595821 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-z78kn" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" Mar 20 10:39:06 crc 
kubenswrapper[4748]: E0320 10:39:06.618050 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 10:39:06 crc kubenswrapper[4748]: E0320 10:39:06.618261 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jsdvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-c4jdq_openshift-marketplace(a82b758c-eb62-429b-b092-5ba4a7cf665d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:39:06 crc kubenswrapper[4748]: E0320 10:39:06.619492 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-c4jdq" podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" Mar 20 10:39:06 crc kubenswrapper[4748]: E0320 10:39:06.723918 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60 is running failed: container process not found" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:39:06 crc kubenswrapper[4748]: E0320 10:39:06.724296 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60 is running failed: container process not found" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 10:39:06 crc kubenswrapper[4748]: E0320 10:39:06.724925 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60 is running failed: container process not found" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" cmd=["/bin/bash","-c","test -f 
/ready/ready"] Mar 20 10:39:06 crc kubenswrapper[4748]: E0320 10:39:06.724980 4748 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" podUID="c0116c88-07ce-460e-90ba-2b5d8fd6a921" containerName="kube-multus-additional-cni-plugins" Mar 20 10:39:08 crc kubenswrapper[4748]: E0320 10:39:08.147125 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-z78kn" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" Mar 20 10:39:08 crc kubenswrapper[4748]: E0320 10:39:08.147710 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-c4jdq" podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" Mar 20 10:39:08 crc kubenswrapper[4748]: E0320 10:39:08.249955 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 10:39:08 crc kubenswrapper[4748]: E0320 10:39:08.250332 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bsdsv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-b6j7x_openshift-marketplace(61390690-1bd1-43c9-b82b-e2c5fe3450f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:39:08 crc kubenswrapper[4748]: E0320 10:39:08.251666 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-b6j7x" podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" Mar 20 10:39:08 crc 
kubenswrapper[4748]: I0320 10:39:08.258112 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c5vw7"] Mar 20 10:39:09 crc kubenswrapper[4748]: E0320 10:39:09.953364 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-b6j7x" podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" Mar 20 10:39:09 crc kubenswrapper[4748]: I0320 10:39:09.991114 4748 scope.go:117] "RemoveContainer" containerID="d944b9f004ae9d053dba65cf05ec5acfa9768648878718716f55ee581a98cc26" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.021985 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.022200 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g8bpp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bsq7p_openshift-marketplace(8e249463-f3e7-4aed-a0ac-97c54af87949): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.023400 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bsq7p" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" Mar 20 10:39:10 crc 
kubenswrapper[4748]: E0320 10:39:10.064654 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.064943 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bnzq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-8kqb5_openshift-marketplace(0186ffa9-907a-4afd-953d-28665f7343da): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.066621 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8kqb5" podUID="0186ffa9-907a-4afd-953d-28665f7343da" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.072366 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-9cpg7_c0116c88-07ce-460e-90ba-2b5d8fd6a921/kube-multus-additional-cni-plugins/0.log" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.072448 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.115990 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.116338 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcb89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-llsmr_openshift-marketplace(5196dc0c-2a46-4fb2-891b-682a6ce5eed9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.117711 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-llsmr" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.245398 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr"] Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.248924 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c0116c88-07ce-460e-90ba-2b5d8fd6a921-cni-sysctl-allowlist\") pod \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.248965 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb7kd\" (UniqueName: \"kubernetes.io/projected/c0116c88-07ce-460e-90ba-2b5d8fd6a921-kube-api-access-fb7kd\") pod \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.248986 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0116c88-07ce-460e-90ba-2b5d8fd6a921-tuning-conf-dir\") pod \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " Mar 
20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.249093 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c0116c88-07ce-460e-90ba-2b5d8fd6a921-ready\") pod \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\" (UID: \"c0116c88-07ce-460e-90ba-2b5d8fd6a921\") " Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.249943 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0116c88-07ce-460e-90ba-2b5d8fd6a921-ready" (OuterVolumeSpecName: "ready") pod "c0116c88-07ce-460e-90ba-2b5d8fd6a921" (UID: "c0116c88-07ce-460e-90ba-2b5d8fd6a921"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.250942 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0116c88-07ce-460e-90ba-2b5d8fd6a921-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "c0116c88-07ce-460e-90ba-2b5d8fd6a921" (UID: "c0116c88-07ce-460e-90ba-2b5d8fd6a921"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:39:10 crc kubenswrapper[4748]: W0320 10:39:10.254565 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9274aea_7ec5_410b_a183_f67a45ee5241.slice/crio-4cc415fa88e19e24f0bca8be4a054b5914ee59095f4dac5c12677f65152ebc0c WatchSource:0}: Error finding container 4cc415fa88e19e24f0bca8be4a054b5914ee59095f4dac5c12677f65152ebc0c: Status 404 returned error can't find the container with id 4cc415fa88e19e24f0bca8be4a054b5914ee59095f4dac5c12677f65152ebc0c Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.256264 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0116c88-07ce-460e-90ba-2b5d8fd6a921-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "c0116c88-07ce-460e-90ba-2b5d8fd6a921" (UID: "c0116c88-07ce-460e-90ba-2b5d8fd6a921"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.271810 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0116c88-07ce-460e-90ba-2b5d8fd6a921-kube-api-access-fb7kd" (OuterVolumeSpecName: "kube-api-access-fb7kd") pod "c0116c88-07ce-460e-90ba-2b5d8fd6a921" (UID: "c0116c88-07ce-460e-90ba-2b5d8fd6a921"). InnerVolumeSpecName "kube-api-access-fb7kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.350629 4748 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c0116c88-07ce-460e-90ba-2b5d8fd6a921-ready\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.350656 4748 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c0116c88-07ce-460e-90ba-2b5d8fd6a921-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.350667 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb7kd\" (UniqueName: \"kubernetes.io/projected/c0116c88-07ce-460e-90ba-2b5d8fd6a921-kube-api-access-fb7kd\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.350678 4748 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0116c88-07ce-460e-90ba-2b5d8fd6a921-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.425945 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.453058 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.453522 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mgnhj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5fgxw_openshift-marketplace(e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.454700 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5fgxw" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" Mar 20 10:39:10 crc 
kubenswrapper[4748]: I0320 10:39:10.524025 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44"] Mar 20 10:39:10 crc kubenswrapper[4748]: W0320 10:39:10.535246 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac3e59d_eae4_4116_adb8_eca1c29a6b4f.slice/crio-e0827928bf1ed50cdf4798dded4236f37dda83dd8c3f0404b2ba9ef4b34a577e WatchSource:0}: Error finding container e0827928bf1ed50cdf4798dded4236f37dda83dd8c3f0404b2ba9ef4b34a577e: Status 404 returned error can't find the container with id e0827928bf1ed50cdf4798dded4236f37dda83dd8c3f0404b2ba9ef4b34a577e Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.558319 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.938171 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-9cpg7_c0116c88-07ce-460e-90ba-2b5d8fd6a921/kube-multus-additional-cni-plugins/0.log" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.938500 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" event={"ID":"c0116c88-07ce-460e-90ba-2b5d8fd6a921","Type":"ContainerDied","Data":"81e4b77d8b16210c5e522889b70c1a16f09db91b82a58af028394ed897c21715"} Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.938536 4748 scope.go:117] "RemoveContainer" containerID="74d970657f1637dca93f3ccd0f395ea17b1b976614aebcf35538a39c03b75a60" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.938611 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-9cpg7" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.944454 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c183842b-4ff7-4ea0-b4fb-a1641c33efd1","Type":"ContainerStarted","Data":"8d39fce8992a8d01b97c0b6109ddc09be7db7591f4a2878a923fee87db69fd79"} Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.944498 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c183842b-4ff7-4ea0-b4fb-a1641c33efd1","Type":"ContainerStarted","Data":"71e7f54e16265b39657208f677da223df0434e46779159613ad66904e96a9528"} Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.945938 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" event={"ID":"fac3e59d-eae4-4116-adb8-eca1c29a6b4f","Type":"ContainerStarted","Data":"9159b5f330758eba7a78ad10dd9e56c04b528bf1a5e42e14175096e82491e0f6"} Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.945965 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" event={"ID":"fac3e59d-eae4-4116-adb8-eca1c29a6b4f","Type":"ContainerStarted","Data":"e0827928bf1ed50cdf4798dded4236f37dda83dd8c3f0404b2ba9ef4b34a577e"} Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.946874 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d","Type":"ContainerStarted","Data":"5fbf0b08149adccd996ee34fb49ea97a2662871262aeec59df320080123c6175"} Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.950376 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" 
event={"ID":"a9274aea-7ec5-410b-a183-f67a45ee5241","Type":"ContainerStarted","Data":"067458dd3e7c2dede621fcaf9ba3479682b15253b8c33280d96f1fbc0595b0bf"} Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.950405 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" event={"ID":"a9274aea-7ec5-410b-a183-f67a45ee5241","Type":"ContainerStarted","Data":"4cc415fa88e19e24f0bca8be4a054b5914ee59095f4dac5c12677f65152ebc0c"} Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.950418 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.951451 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-llsmr" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.959737 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.968328 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8kqb5" podUID="0186ffa9-907a-4afd-953d-28665f7343da" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.968439 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5fgxw" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" Mar 20 10:39:10 crc kubenswrapper[4748]: E0320 10:39:10.968454 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bsq7p" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" Mar 20 10:39:10 crc kubenswrapper[4748]: I0320 10:39:10.997920 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" podStartSLOduration=8.997903703 podStartE2EDuration="8.997903703s" podCreationTimestamp="2026-03-20 10:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:39:10.99665749 +0000 UTC m=+186.138203314" watchObservedRunningTime="2026-03-20 10:39:10.997903703 +0000 UTC m=+186.139449517" Mar 20 10:39:11 crc kubenswrapper[4748]: I0320 10:39:11.064456 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-9cpg7"] Mar 20 10:39:11 crc kubenswrapper[4748]: I0320 10:39:11.068166 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-9cpg7"] Mar 20 10:39:11 crc kubenswrapper[4748]: I0320 10:39:11.525621 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0116c88-07ce-460e-90ba-2b5d8fd6a921" path="/var/lib/kubelet/pods/c0116c88-07ce-460e-90ba-2b5d8fd6a921/volumes" Mar 20 10:39:11 crc kubenswrapper[4748]: I0320 10:39:11.957030 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d","Type":"ContainerStarted","Data":"b3059ea8fec63ee9b7a63374cc436e6ce45be13174e66222e836a504de10238d"} Mar 20 10:39:11 crc kubenswrapper[4748]: I0320 10:39:11.959730 4748 generic.go:334] "Generic (PLEG): container finished" podID="c183842b-4ff7-4ea0-b4fb-a1641c33efd1" containerID="8d39fce8992a8d01b97c0b6109ddc09be7db7591f4a2878a923fee87db69fd79" exitCode=0 Mar 20 10:39:11 crc kubenswrapper[4748]: I0320 10:39:11.959821 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c183842b-4ff7-4ea0-b4fb-a1641c33efd1","Type":"ContainerDied","Data":"8d39fce8992a8d01b97c0b6109ddc09be7db7591f4a2878a923fee87db69fd79"} Mar 20 10:39:11 crc kubenswrapper[4748]: I0320 10:39:11.960027 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" podUID="fac3e59d-eae4-4116-adb8-eca1c29a6b4f" containerName="controller-manager" containerID="cri-o://9159b5f330758eba7a78ad10dd9e56c04b528bf1a5e42e14175096e82491e0f6" gracePeriod=30 Mar 20 10:39:11 crc kubenswrapper[4748]: I0320 10:39:11.977281 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.977255132 podStartE2EDuration="7.977255132s" podCreationTimestamp="2026-03-20 10:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:39:11.974735676 +0000 UTC m=+187.116281490" watchObservedRunningTime="2026-03-20 10:39:11.977255132 +0000 UTC m=+187.118800956" Mar 20 10:39:12 crc kubenswrapper[4748]: I0320 10:39:12.020981 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" podStartSLOduration=30.020959875 podStartE2EDuration="30.020959875s" podCreationTimestamp="2026-03-20 
10:38:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:39:12.019673441 +0000 UTC m=+187.161219265" watchObservedRunningTime="2026-03-20 10:39:12.020959875 +0000 UTC m=+187.162505689" Mar 20 10:39:12 crc kubenswrapper[4748]: I0320 10:39:12.964360 4748 generic.go:334] "Generic (PLEG): container finished" podID="fac3e59d-eae4-4116-adb8-eca1c29a6b4f" containerID="9159b5f330758eba7a78ad10dd9e56c04b528bf1a5e42e14175096e82491e0f6" exitCode=0 Mar 20 10:39:12 crc kubenswrapper[4748]: I0320 10:39:12.964510 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" event={"ID":"fac3e59d-eae4-4116-adb8-eca1c29a6b4f","Type":"ContainerDied","Data":"9159b5f330758eba7a78ad10dd9e56c04b528bf1a5e42e14175096e82491e0f6"} Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.126230 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.160582 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6677c6f89c-fth76"] Mar 20 10:39:13 crc kubenswrapper[4748]: E0320 10:39:13.160952 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac3e59d-eae4-4116-adb8-eca1c29a6b4f" containerName="controller-manager" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.160971 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac3e59d-eae4-4116-adb8-eca1c29a6b4f" containerName="controller-manager" Mar 20 10:39:13 crc kubenswrapper[4748]: E0320 10:39:13.160987 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0116c88-07ce-460e-90ba-2b5d8fd6a921" containerName="kube-multus-additional-cni-plugins" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.160996 4748 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c0116c88-07ce-460e-90ba-2b5d8fd6a921" containerName="kube-multus-additional-cni-plugins" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.161127 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0116c88-07ce-460e-90ba-2b5d8fd6a921" containerName="kube-multus-additional-cni-plugins" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.161141 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac3e59d-eae4-4116-adb8-eca1c29a6b4f" containerName="controller-manager" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.161668 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.163421 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6677c6f89c-fth76"] Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.220095 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.287812 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-config\") pod \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.287915 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-proxy-ca-bundles\") pod \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.287971 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssq8b\" (UniqueName: \"kubernetes.io/projected/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-kube-api-access-ssq8b\") pod \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.288001 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-serving-cert\") pod \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.288031 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-client-ca\") pod \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\" (UID: \"fac3e59d-eae4-4116-adb8-eca1c29a6b4f\") " Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.288187 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-config\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.288248 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-client-ca\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.288301 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m6d9\" (UniqueName: \"kubernetes.io/projected/fc63f09e-a566-4a38-855a-efe2fe737848-kube-api-access-2m6d9\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.288329 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-proxy-ca-bundles\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.288347 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc63f09e-a566-4a38-855a-efe2fe737848-serving-cert\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " 
pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.289185 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-config" (OuterVolumeSpecName: "config") pod "fac3e59d-eae4-4116-adb8-eca1c29a6b4f" (UID: "fac3e59d-eae4-4116-adb8-eca1c29a6b4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.289438 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fac3e59d-eae4-4116-adb8-eca1c29a6b4f" (UID: "fac3e59d-eae4-4116-adb8-eca1c29a6b4f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.289847 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-client-ca" (OuterVolumeSpecName: "client-ca") pod "fac3e59d-eae4-4116-adb8-eca1c29a6b4f" (UID: "fac3e59d-eae4-4116-adb8-eca1c29a6b4f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.294644 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-kube-api-access-ssq8b" (OuterVolumeSpecName: "kube-api-access-ssq8b") pod "fac3e59d-eae4-4116-adb8-eca1c29a6b4f" (UID: "fac3e59d-eae4-4116-adb8-eca1c29a6b4f"). InnerVolumeSpecName "kube-api-access-ssq8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.306858 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fac3e59d-eae4-4116-adb8-eca1c29a6b4f" (UID: "fac3e59d-eae4-4116-adb8-eca1c29a6b4f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389293 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kube-api-access\") pod \"c183842b-4ff7-4ea0-b4fb-a1641c33efd1\" (UID: \"c183842b-4ff7-4ea0-b4fb-a1641c33efd1\") " Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389352 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kubelet-dir\") pod \"c183842b-4ff7-4ea0-b4fb-a1641c33efd1\" (UID: \"c183842b-4ff7-4ea0-b4fb-a1641c33efd1\") " Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389430 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c183842b-4ff7-4ea0-b4fb-a1641c33efd1" (UID: "c183842b-4ff7-4ea0-b4fb-a1641c33efd1"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389488 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-config\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389544 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-client-ca\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389580 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m6d9\" (UniqueName: \"kubernetes.io/projected/fc63f09e-a566-4a38-855a-efe2fe737848-kube-api-access-2m6d9\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389600 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc63f09e-a566-4a38-855a-efe2fe737848-serving-cert\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389616 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-proxy-ca-bundles\") pod 
\"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389661 4748 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389673 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssq8b\" (UniqueName: \"kubernetes.io/projected/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-kube-api-access-ssq8b\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389685 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389694 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389702 4748 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.389711 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fac3e59d-eae4-4116-adb8-eca1c29a6b4f-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.390712 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-client-ca\") pod 
\"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.390978 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-proxy-ca-bundles\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.391016 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-config\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.392714 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c183842b-4ff7-4ea0-b4fb-a1641c33efd1" (UID: "c183842b-4ff7-4ea0-b4fb-a1641c33efd1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.393311 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc63f09e-a566-4a38-855a-efe2fe737848-serving-cert\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.412221 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m6d9\" (UniqueName: \"kubernetes.io/projected/fc63f09e-a566-4a38-855a-efe2fe737848-kube-api-access-2m6d9\") pod \"controller-manager-6677c6f89c-fth76\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.490997 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c183842b-4ff7-4ea0-b4fb-a1641c33efd1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.518942 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.693918 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6677c6f89c-fth76"] Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.972718 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" event={"ID":"fc63f09e-a566-4a38-855a-efe2fe737848","Type":"ContainerStarted","Data":"3b604433f968363d4e70c60beddfeec3705bdef87ed7ef5da2b27da4b7767f4c"} Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.972774 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" event={"ID":"fc63f09e-a566-4a38-855a-efe2fe737848","Type":"ContainerStarted","Data":"a8dc467391b89d96131bb4d3771082996618dfd5dd6bfad81496ea316746d18a"} Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.973169 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.974546 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" event={"ID":"fac3e59d-eae4-4116-adb8-eca1c29a6b4f","Type":"ContainerDied","Data":"e0827928bf1ed50cdf4798dded4236f37dda83dd8c3f0404b2ba9ef4b34a577e"} Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.974627 4748 scope.go:117] "RemoveContainer" containerID="9159b5f330758eba7a78ad10dd9e56c04b528bf1a5e42e14175096e82491e0f6" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.974726 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.976268 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c183842b-4ff7-4ea0-b4fb-a1641c33efd1","Type":"ContainerDied","Data":"71e7f54e16265b39657208f677da223df0434e46779159613ad66904e96a9528"} Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.976288 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71e7f54e16265b39657208f677da223df0434e46779159613ad66904e96a9528" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.976312 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.984959 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:13 crc kubenswrapper[4748]: I0320 10:39:13.997273 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" podStartSLOduration=11.997248908 podStartE2EDuration="11.997248908s" podCreationTimestamp="2026-03-20 10:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:39:13.995485632 +0000 UTC m=+189.137031456" watchObservedRunningTime="2026-03-20 10:39:13.997248908 +0000 UTC m=+189.138794722" Mar 20 10:39:14 crc kubenswrapper[4748]: I0320 10:39:14.007995 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44"] Mar 20 10:39:14 crc kubenswrapper[4748]: I0320 10:39:14.022725 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-9f7b5d9c8-lxv44"] Mar 20 10:39:15 crc kubenswrapper[4748]: I0320 10:39:15.523073 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fac3e59d-eae4-4116-adb8-eca1c29a6b4f" path="/var/lib/kubelet/pods/fac3e59d-eae4-4116-adb8-eca1c29a6b4f/volumes" Mar 20 10:39:22 crc kubenswrapper[4748]: I0320 10:39:22.026285 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6677c6f89c-fth76"] Mar 20 10:39:22 crc kubenswrapper[4748]: I0320 10:39:22.026924 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" podUID="fc63f09e-a566-4a38-855a-efe2fe737848" containerName="controller-manager" containerID="cri-o://3b604433f968363d4e70c60beddfeec3705bdef87ed7ef5da2b27da4b7767f4c" gracePeriod=30 Mar 20 10:39:22 crc kubenswrapper[4748]: I0320 10:39:22.039804 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr"] Mar 20 10:39:22 crc kubenswrapper[4748]: I0320 10:39:22.040269 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" podUID="a9274aea-7ec5-410b-a183-f67a45ee5241" containerName="route-controller-manager" containerID="cri-o://067458dd3e7c2dede621fcaf9ba3479682b15253b8c33280d96f1fbc0595b0bf" gracePeriod=30 Mar 20 10:39:22 crc kubenswrapper[4748]: I0320 10:39:22.458176 4748 patch_prober.go:28] interesting pod/route-controller-manager-dfc8ff6c5-9t8fr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Mar 20 10:39:22 crc kubenswrapper[4748]: I0320 10:39:22.458452 4748 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" podUID="a9274aea-7ec5-410b-a183-f67a45ee5241" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Mar 20 10:39:23 crc kubenswrapper[4748]: I0320 10:39:23.034082 4748 generic.go:334] "Generic (PLEG): container finished" podID="a9274aea-7ec5-410b-a183-f67a45ee5241" containerID="067458dd3e7c2dede621fcaf9ba3479682b15253b8c33280d96f1fbc0595b0bf" exitCode=0 Mar 20 10:39:23 crc kubenswrapper[4748]: I0320 10:39:23.034159 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" event={"ID":"a9274aea-7ec5-410b-a183-f67a45ee5241","Type":"ContainerDied","Data":"067458dd3e7c2dede621fcaf9ba3479682b15253b8c33280d96f1fbc0595b0bf"} Mar 20 10:39:23 crc kubenswrapper[4748]: I0320 10:39:23.035614 4748 generic.go:334] "Generic (PLEG): container finished" podID="fc63f09e-a566-4a38-855a-efe2fe737848" containerID="3b604433f968363d4e70c60beddfeec3705bdef87ed7ef5da2b27da4b7767f4c" exitCode=0 Mar 20 10:39:23 crc kubenswrapper[4748]: I0320 10:39:23.035659 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" event={"ID":"fc63f09e-a566-4a38-855a-efe2fe737848","Type":"ContainerDied","Data":"3b604433f968363d4e70c60beddfeec3705bdef87ed7ef5da2b27da4b7767f4c"} Mar 20 10:39:23 crc kubenswrapper[4748]: I0320 10:39:23.519770 4748 patch_prober.go:28] interesting pod/controller-manager-6677c6f89c-fth76 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body= Mar 20 10:39:23 crc kubenswrapper[4748]: I0320 10:39:23.519905 4748 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" podUID="fc63f09e-a566-4a38-855a-efe2fe737848" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.046011 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" event={"ID":"a9274aea-7ec5-410b-a183-f67a45ee5241","Type":"ContainerDied","Data":"4cc415fa88e19e24f0bca8be4a054b5914ee59095f4dac5c12677f65152ebc0c"} Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.046329 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cc415fa88e19e24f0bca8be4a054b5914ee59095f4dac5c12677f65152ebc0c" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.054612 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.099925 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn"] Mar 20 10:39:24 crc kubenswrapper[4748]: E0320 10:39:24.100285 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9274aea-7ec5-410b-a183-f67a45ee5241" containerName="route-controller-manager" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.100325 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9274aea-7ec5-410b-a183-f67a45ee5241" containerName="route-controller-manager" Mar 20 10:39:24 crc kubenswrapper[4748]: E0320 10:39:24.100356 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c183842b-4ff7-4ea0-b4fb-a1641c33efd1" containerName="pruner" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.100365 4748 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c183842b-4ff7-4ea0-b4fb-a1641c33efd1" containerName="pruner" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.100525 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c183842b-4ff7-4ea0-b4fb-a1641c33efd1" containerName="pruner" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.100557 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9274aea-7ec5-410b-a183-f67a45ee5241" containerName="route-controller-manager" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.101102 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.106417 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn"] Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.133142 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-config\") pod \"a9274aea-7ec5-410b-a183-f67a45ee5241\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.133221 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cwbv\" (UniqueName: \"kubernetes.io/projected/a9274aea-7ec5-410b-a183-f67a45ee5241-kube-api-access-5cwbv\") pod \"a9274aea-7ec5-410b-a183-f67a45ee5241\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.133315 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-client-ca\") pod \"a9274aea-7ec5-410b-a183-f67a45ee5241\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 
10:39:24.133386 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9274aea-7ec5-410b-a183-f67a45ee5241-serving-cert\") pod \"a9274aea-7ec5-410b-a183-f67a45ee5241\" (UID: \"a9274aea-7ec5-410b-a183-f67a45ee5241\") " Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.146943 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-config" (OuterVolumeSpecName: "config") pod "a9274aea-7ec5-410b-a183-f67a45ee5241" (UID: "a9274aea-7ec5-410b-a183-f67a45ee5241"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.148501 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.149028 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9274aea-7ec5-410b-a183-f67a45ee5241" (UID: "a9274aea-7ec5-410b-a183-f67a45ee5241"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.153328 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9274aea-7ec5-410b-a183-f67a45ee5241-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9274aea-7ec5-410b-a183-f67a45ee5241" (UID: "a9274aea-7ec5-410b-a183-f67a45ee5241"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.153635 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9274aea-7ec5-410b-a183-f67a45ee5241-kube-api-access-5cwbv" (OuterVolumeSpecName: "kube-api-access-5cwbv") pod "a9274aea-7ec5-410b-a183-f67a45ee5241" (UID: "a9274aea-7ec5-410b-a183-f67a45ee5241"). InnerVolumeSpecName "kube-api-access-5cwbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.249413 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-client-ca\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.249468 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-config\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.249489 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7798b202-00ed-40e7-b4d2-abccb941ed18-serving-cert\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.249503 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tnp4\" (UniqueName: \"kubernetes.io/projected/7798b202-00ed-40e7-b4d2-abccb941ed18-kube-api-access-9tnp4\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.249605 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cwbv\" (UniqueName: \"kubernetes.io/projected/a9274aea-7ec5-410b-a183-f67a45ee5241-kube-api-access-5cwbv\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.249617 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9274aea-7ec5-410b-a183-f67a45ee5241-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.249628 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9274aea-7ec5-410b-a183-f67a45ee5241-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.350761 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-client-ca\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.350900 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-config\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " 
pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.350964 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7798b202-00ed-40e7-b4d2-abccb941ed18-serving-cert\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.351008 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tnp4\" (UniqueName: \"kubernetes.io/projected/7798b202-00ed-40e7-b4d2-abccb941ed18-kube-api-access-9tnp4\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.351875 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-client-ca\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.353387 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-config\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.363891 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7798b202-00ed-40e7-b4d2-abccb941ed18-serving-cert\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.366673 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tnp4\" (UniqueName: \"kubernetes.io/projected/7798b202-00ed-40e7-b4d2-abccb941ed18-kube-api-access-9tnp4\") pod \"route-controller-manager-59c68db7c4-66ddn\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.414945 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.421755 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.555199 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m6d9\" (UniqueName: \"kubernetes.io/projected/fc63f09e-a566-4a38-855a-efe2fe737848-kube-api-access-2m6d9\") pod \"fc63f09e-a566-4a38-855a-efe2fe737848\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.555507 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc63f09e-a566-4a38-855a-efe2fe737848-serving-cert\") pod \"fc63f09e-a566-4a38-855a-efe2fe737848\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.555623 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-proxy-ca-bundles\") pod \"fc63f09e-a566-4a38-855a-efe2fe737848\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.555651 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-client-ca\") pod \"fc63f09e-a566-4a38-855a-efe2fe737848\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.555690 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-config\") pod \"fc63f09e-a566-4a38-855a-efe2fe737848\" (UID: \"fc63f09e-a566-4a38-855a-efe2fe737848\") " Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.557135 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-client-ca" (OuterVolumeSpecName: "client-ca") pod "fc63f09e-a566-4a38-855a-efe2fe737848" (UID: "fc63f09e-a566-4a38-855a-efe2fe737848"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.557250 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-config" (OuterVolumeSpecName: "config") pod "fc63f09e-a566-4a38-855a-efe2fe737848" (UID: "fc63f09e-a566-4a38-855a-efe2fe737848"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.558244 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fc63f09e-a566-4a38-855a-efe2fe737848" (UID: "fc63f09e-a566-4a38-855a-efe2fe737848"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.563142 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc63f09e-a566-4a38-855a-efe2fe737848-kube-api-access-2m6d9" (OuterVolumeSpecName: "kube-api-access-2m6d9") pod "fc63f09e-a566-4a38-855a-efe2fe737848" (UID: "fc63f09e-a566-4a38-855a-efe2fe737848"). InnerVolumeSpecName "kube-api-access-2m6d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.569498 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc63f09e-a566-4a38-855a-efe2fe737848-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fc63f09e-a566-4a38-855a-efe2fe737848" (UID: "fc63f09e-a566-4a38-855a-efe2fe737848"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.610670 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn"] Mar 20 10:39:24 crc kubenswrapper[4748]: W0320 10:39:24.622884 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7798b202_00ed_40e7_b4d2_abccb941ed18.slice/crio-82262975ba0696e1068ec23ec9ca1cf4b588d884a674b4ca9d79628e8d1672aa WatchSource:0}: Error finding container 82262975ba0696e1068ec23ec9ca1cf4b588d884a674b4ca9d79628e8d1672aa: Status 404 returned error can't find the container with id 82262975ba0696e1068ec23ec9ca1cf4b588d884a674b4ca9d79628e8d1672aa Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.657566 4748 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.657598 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.657611 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc63f09e-a566-4a38-855a-efe2fe737848-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.657622 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m6d9\" (UniqueName: \"kubernetes.io/projected/fc63f09e-a566-4a38-855a-efe2fe737848-kube-api-access-2m6d9\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:24 crc kubenswrapper[4748]: I0320 10:39:24.657630 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/fc63f09e-a566-4a38-855a-efe2fe737848-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.053433 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" event={"ID":"fc63f09e-a566-4a38-855a-efe2fe737848","Type":"ContainerDied","Data":"a8dc467391b89d96131bb4d3771082996618dfd5dd6bfad81496ea316746d18a"} Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.054992 4748 scope.go:117] "RemoveContainer" containerID="3b604433f968363d4e70c60beddfeec3705bdef87ed7ef5da2b27da4b7767f4c" Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.055230 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6677c6f89c-fth76" Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.061379 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566718-7hmv9" event={"ID":"0de9aa72-edab-4ae9-b2dd-e20ef6b83277","Type":"ContainerStarted","Data":"7d3e72e3fd6fca7d1a1240ac3c55ba176e943063f21e150b1a4cea49f6cce92e"} Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.064800 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr" Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.066034 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" event={"ID":"7798b202-00ed-40e7-b4d2-abccb941ed18","Type":"ContainerStarted","Data":"82262975ba0696e1068ec23ec9ca1cf4b588d884a674b4ca9d79628e8d1672aa"} Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.098532 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6677c6f89c-fth76"] Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.103580 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6677c6f89c-fth76"] Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.129460 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr"] Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.130773 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dfc8ff6c5-9t8fr"] Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.525115 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9274aea-7ec5-410b-a183-f67a45ee5241" path="/var/lib/kubelet/pods/a9274aea-7ec5-410b-a183-f67a45ee5241/volumes" Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.526446 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc63f09e-a566-4a38-855a-efe2fe737848" path="/var/lib/kubelet/pods/fc63f09e-a566-4a38-855a-efe2fe737848/volumes" Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.532430 4748 csr.go:261] certificate signing request csr-tm6dj is approved, waiting to be issued Mar 20 10:39:25 crc kubenswrapper[4748]: I0320 10:39:25.540877 4748 csr.go:257] certificate signing 
request csr-tm6dj is issued Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.076707 4748 generic.go:334] "Generic (PLEG): container finished" podID="0de9aa72-edab-4ae9-b2dd-e20ef6b83277" containerID="7d3e72e3fd6fca7d1a1240ac3c55ba176e943063f21e150b1a4cea49f6cce92e" exitCode=0 Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.076778 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566718-7hmv9" event={"ID":"0de9aa72-edab-4ae9-b2dd-e20ef6b83277","Type":"ContainerDied","Data":"7d3e72e3fd6fca7d1a1240ac3c55ba176e943063f21e150b1a4cea49f6cce92e"} Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.079188 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" event={"ID":"7798b202-00ed-40e7-b4d2-abccb941ed18","Type":"ContainerStarted","Data":"f04b0db02939f225cfe90cae89f401ba8e7ef1d9a60716a6a092411524c35c41"} Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.079681 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.090575 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.133262 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" podStartSLOduration=4.133231573 podStartE2EDuration="4.133231573s" podCreationTimestamp="2026-03-20 10:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:39:26.117910158 +0000 UTC m=+201.259455992" watchObservedRunningTime="2026-03-20 10:39:26.133231573 +0000 UTC m=+201.274777387" 
Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.542563 4748 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-20 02:38:48.87476791 +0000 UTC Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.542633 4748 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6591h59m22.332138526s for next certificate rotation Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.613395 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s"] Mar 20 10:39:26 crc kubenswrapper[4748]: E0320 10:39:26.613634 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc63f09e-a566-4a38-855a-efe2fe737848" containerName="controller-manager" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.613652 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc63f09e-a566-4a38-855a-efe2fe737848" containerName="controller-manager" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.613767 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc63f09e-a566-4a38-855a-efe2fe737848" containerName="controller-manager" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.614147 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.616543 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.616619 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.617394 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.617885 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.617962 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.618239 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.623103 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.631403 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s"] Mar 20 10:39:26 crc kubenswrapper[4748]: E0320 10:39:26.782423 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61390690_1bd1_43c9_b82b_e2c5fe3450f9.slice/crio-37f2f4501a16a8351b691a83364c3a23446a050ebcdbf05e894a8d004c90cd9b.scope\": RecentStats: unable to find data in memory cache]" Mar 
20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.790165 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-client-ca\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.790283 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-config\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.790350 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-proxy-ca-bundles\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.790421 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnj8m\" (UniqueName: \"kubernetes.io/projected/71cc80b6-74af-413e-8b36-3d637f7dd31e-kube-api-access-rnj8m\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.790457 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/71cc80b6-74af-413e-8b36-3d637f7dd31e-serving-cert\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.894469 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-proxy-ca-bundles\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.894845 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnj8m\" (UniqueName: \"kubernetes.io/projected/71cc80b6-74af-413e-8b36-3d637f7dd31e-kube-api-access-rnj8m\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.894978 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71cc80b6-74af-413e-8b36-3d637f7dd31e-serving-cert\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.895083 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-client-ca\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.895200 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-config\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.896042 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-proxy-ca-bundles\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.896723 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-config\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.897450 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-client-ca\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.902804 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71cc80b6-74af-413e-8b36-3d637f7dd31e-serving-cert\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc 
kubenswrapper[4748]: I0320 10:39:26.922990 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnj8m\" (UniqueName: \"kubernetes.io/projected/71cc80b6-74af-413e-8b36-3d637f7dd31e-kube-api-access-rnj8m\") pod \"controller-manager-7c6bdd87c4-fmn7s\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:26 crc kubenswrapper[4748]: I0320 10:39:26.987001 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.093073 4748 generic.go:334] "Generic (PLEG): container finished" podID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerID="b4effbfd33d6ef8d91e8660fe1d1fbc0343f651759fe0b9791c78d2386c70c23" exitCode=0 Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.093293 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhpdf" event={"ID":"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7","Type":"ContainerDied","Data":"b4effbfd33d6ef8d91e8660fe1d1fbc0343f651759fe0b9791c78d2386c70c23"} Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.100298 4748 generic.go:334] "Generic (PLEG): container finished" podID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerID="3f8eab936b75ab67d84bb367d0d05d8e2344ddee5c8d6abf180165b53d20fad3" exitCode=0 Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.100459 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsmr" event={"ID":"5196dc0c-2a46-4fb2-891b-682a6ce5eed9","Type":"ContainerDied","Data":"3f8eab936b75ab67d84bb367d0d05d8e2344ddee5c8d6abf180165b53d20fad3"} Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.111303 4748 generic.go:334] "Generic (PLEG): container finished" podID="a82b758c-eb62-429b-b092-5ba4a7cf665d" 
containerID="0502488e69ed276258241b28cb6dd7e542373a7e6f016787a30dbf2bb32acef8" exitCode=0 Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.111441 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4jdq" event={"ID":"a82b758c-eb62-429b-b092-5ba4a7cf665d","Type":"ContainerDied","Data":"0502488e69ed276258241b28cb6dd7e542373a7e6f016787a30dbf2bb32acef8"} Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.135100 4748 generic.go:334] "Generic (PLEG): container finished" podID="0186ffa9-907a-4afd-953d-28665f7343da" containerID="7fa50158e264bdda9d565f4dcd7ca54a039159bce144ead50f845245684d7d5e" exitCode=0 Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.135202 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kqb5" event={"ID":"0186ffa9-907a-4afd-953d-28665f7343da","Type":"ContainerDied","Data":"7fa50158e264bdda9d565f4dcd7ca54a039159bce144ead50f845245684d7d5e"} Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.142975 4748 generic.go:334] "Generic (PLEG): container finished" podID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerID="ec0102935c3f45ce38fa4fc9c17992d5373ea8705bcb5a1a8545406280dc0745" exitCode=0 Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.143059 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fgxw" event={"ID":"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b","Type":"ContainerDied","Data":"ec0102935c3f45ce38fa4fc9c17992d5373ea8705bcb5a1a8545406280dc0745"} Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.145757 4748 generic.go:334] "Generic (PLEG): container finished" podID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" containerID="37f2f4501a16a8351b691a83364c3a23446a050ebcdbf05e894a8d004c90cd9b" exitCode=0 Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.145899 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6j7x" 
event={"ID":"61390690-1bd1-43c9-b82b-e2c5fe3450f9","Type":"ContainerDied","Data":"37f2f4501a16a8351b691a83364c3a23446a050ebcdbf05e894a8d004c90cd9b"} Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.148994 4748 generic.go:334] "Generic (PLEG): container finished" podID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerID="6a514e8ee4323d9a111787c95d0dbf8b1c1ee49cb8a19f29ae2fe90c5525e67d" exitCode=0 Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.149088 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z78kn" event={"ID":"7e393e84-d0bb-4258-8eef-012c9269fc05","Type":"ContainerDied","Data":"6a514e8ee4323d9a111787c95d0dbf8b1c1ee49cb8a19f29ae2fe90c5525e67d"} Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.153033 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerID="1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a" exitCode=0 Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.153181 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsq7p" event={"ID":"8e249463-f3e7-4aed-a0ac-97c54af87949","Type":"ContainerDied","Data":"1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a"} Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.362967 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566718-7hmv9" Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.404301 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s"] Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.505869 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpwpj\" (UniqueName: \"kubernetes.io/projected/0de9aa72-edab-4ae9-b2dd-e20ef6b83277-kube-api-access-wpwpj\") pod \"0de9aa72-edab-4ae9-b2dd-e20ef6b83277\" (UID: \"0de9aa72-edab-4ae9-b2dd-e20ef6b83277\") " Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.511665 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0de9aa72-edab-4ae9-b2dd-e20ef6b83277-kube-api-access-wpwpj" (OuterVolumeSpecName: "kube-api-access-wpwpj") pod "0de9aa72-edab-4ae9-b2dd-e20ef6b83277" (UID: "0de9aa72-edab-4ae9-b2dd-e20ef6b83277"). InnerVolumeSpecName "kube-api-access-wpwpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.543150 4748 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-17 12:27:23.797333301 +0000 UTC Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.543272 4748 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6529h47m56.254065276s for next certificate rotation Mar 20 10:39:27 crc kubenswrapper[4748]: I0320 10:39:27.607280 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpwpj\" (UniqueName: \"kubernetes.io/projected/0de9aa72-edab-4ae9-b2dd-e20ef6b83277-kube-api-access-wpwpj\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.162488 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhpdf" event={"ID":"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7","Type":"ContainerStarted","Data":"ba911fedc155e07ca9be1b8ab528bc7e5004920acc9881ff33a627adf1550e72"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.164802 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsmr" event={"ID":"5196dc0c-2a46-4fb2-891b-682a6ce5eed9","Type":"ContainerStarted","Data":"818af1be7d986ef78e48ac5b436873844ea2ba14f05332d0bbd9e5da66c7b37b"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.166770 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4jdq" event={"ID":"a82b758c-eb62-429b-b092-5ba4a7cf665d","Type":"ContainerStarted","Data":"25add1f159d9c2b06a457989173a938f8c1e9b19a8a2bf039a4488820dfa8b84"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.170576 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kqb5" 
event={"ID":"0186ffa9-907a-4afd-953d-28665f7343da","Type":"ContainerStarted","Data":"6b1e7e5524ba07cf6db412bc93cf5887af47b76b7b5baf3e610f543e27197475"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.173253 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fgxw" event={"ID":"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b","Type":"ContainerStarted","Data":"b82a5b5b2403e77bdaf06114412968e18f377f7d281f2a81555980d0dba52549"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.175244 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" event={"ID":"71cc80b6-74af-413e-8b36-3d637f7dd31e","Type":"ContainerStarted","Data":"1fafcb8bc819d47a30cc4b17aa80ff00f13cf806a334cc81113624f47a1f8f69"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.175280 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" event={"ID":"71cc80b6-74af-413e-8b36-3d637f7dd31e","Type":"ContainerStarted","Data":"991be0f417c0bce5fcc68a3789b0dd1f80bb7e29110b4146b8981aa240bd54c5"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.175473 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.180111 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6j7x" event={"ID":"61390690-1bd1-43c9-b82b-e2c5fe3450f9","Type":"ContainerStarted","Data":"2576ae933efc7e5c139af3acae0dcde268a0fd6e25626147530d40547643435e"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.186180 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mhpdf" podStartSLOduration=2.818693953 podStartE2EDuration="1m6.186158577s" podCreationTimestamp="2026-03-20 10:38:22 +0000 UTC" 
firstStartedPulling="2026-03-20 10:38:24.145741748 +0000 UTC m=+139.287287562" lastFinishedPulling="2026-03-20 10:39:27.513206342 +0000 UTC m=+202.654752186" observedRunningTime="2026-03-20 10:39:28.180784256 +0000 UTC m=+203.322330070" watchObservedRunningTime="2026-03-20 10:39:28.186158577 +0000 UTC m=+203.327704411" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.186884 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.191929 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566718-7hmv9" event={"ID":"0de9aa72-edab-4ae9-b2dd-e20ef6b83277","Type":"ContainerDied","Data":"e28cd3ed09ba0b5d63e50a5244ab15bacfd9d12d0375232c3a16a9ff4ce18ef7"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.191979 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e28cd3ed09ba0b5d63e50a5244ab15bacfd9d12d0375232c3a16a9ff4ce18ef7" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.191949 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566718-7hmv9" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.196299 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z78kn" event={"ID":"7e393e84-d0bb-4258-8eef-012c9269fc05","Type":"ContainerStarted","Data":"f8c8c3ae29c737432695cd88f17f33a39cc3e0b0f9828132bc08a1b2de163131"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.198736 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsq7p" event={"ID":"8e249463-f3e7-4aed-a0ac-97c54af87949","Type":"ContainerStarted","Data":"3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0"} Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.211294 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-llsmr" podStartSLOduration=3.801335285 podStartE2EDuration="1m7.21127723s" podCreationTimestamp="2026-03-20 10:38:21 +0000 UTC" firstStartedPulling="2026-03-20 10:38:24.161439642 +0000 UTC m=+139.302985466" lastFinishedPulling="2026-03-20 10:39:27.571381587 +0000 UTC m=+202.712927411" observedRunningTime="2026-03-20 10:39:28.210430298 +0000 UTC m=+203.351976122" watchObservedRunningTime="2026-03-20 10:39:28.21127723 +0000 UTC m=+203.352823034" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.256312 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8kqb5" podStartSLOduration=3.019224483 podStartE2EDuration="1m4.256287588s" podCreationTimestamp="2026-03-20 10:38:24 +0000 UTC" firstStartedPulling="2026-03-20 10:38:26.385177074 +0000 UTC m=+141.526722888" lastFinishedPulling="2026-03-20 10:39:27.622240179 +0000 UTC m=+202.763785993" observedRunningTime="2026-03-20 10:39:28.235606962 +0000 UTC m=+203.377152776" watchObservedRunningTime="2026-03-20 10:39:28.256287588 +0000 UTC m=+203.397833402" Mar 20 
10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.256509 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c4jdq" podStartSLOduration=3.092216442 podStartE2EDuration="1m3.256504943s" podCreationTimestamp="2026-03-20 10:38:25 +0000 UTC" firstStartedPulling="2026-03-20 10:38:27.555711967 +0000 UTC m=+142.697257781" lastFinishedPulling="2026-03-20 10:39:27.720000478 +0000 UTC m=+202.861546282" observedRunningTime="2026-03-20 10:39:28.254803789 +0000 UTC m=+203.396349603" watchObservedRunningTime="2026-03-20 10:39:28.256504943 +0000 UTC m=+203.398050757" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.280977 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" podStartSLOduration=6.280959339 podStartE2EDuration="6.280959339s" podCreationTimestamp="2026-03-20 10:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:39:28.278061682 +0000 UTC m=+203.419607536" watchObservedRunningTime="2026-03-20 10:39:28.280959339 +0000 UTC m=+203.422505143" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.303304 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b6j7x" podStartSLOduration=2.64347192 podStartE2EDuration="1m6.303281558s" podCreationTimestamp="2026-03-20 10:38:22 +0000 UTC" firstStartedPulling="2026-03-20 10:38:24.135288052 +0000 UTC m=+139.276833866" lastFinishedPulling="2026-03-20 10:39:27.79509769 +0000 UTC m=+202.936643504" observedRunningTime="2026-03-20 10:39:28.301046559 +0000 UTC m=+203.442592373" watchObservedRunningTime="2026-03-20 10:39:28.303281558 +0000 UTC m=+203.444827372" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.324001 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-5fgxw" podStartSLOduration=2.687084611 podStartE2EDuration="1m6.323981074s" podCreationTimestamp="2026-03-20 10:38:22 +0000 UTC" firstStartedPulling="2026-03-20 10:38:24.127584129 +0000 UTC m=+139.269129943" lastFinishedPulling="2026-03-20 10:39:27.764480592 +0000 UTC m=+202.906026406" observedRunningTime="2026-03-20 10:39:28.322499185 +0000 UTC m=+203.464045009" watchObservedRunningTime="2026-03-20 10:39:28.323981074 +0000 UTC m=+203.465526888" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.343420 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bsq7p" podStartSLOduration=3.151539724 podStartE2EDuration="1m4.343401506s" podCreationTimestamp="2026-03-20 10:38:24 +0000 UTC" firstStartedPulling="2026-03-20 10:38:26.403717633 +0000 UTC m=+141.545263447" lastFinishedPulling="2026-03-20 10:39:27.595579415 +0000 UTC m=+202.737125229" observedRunningTime="2026-03-20 10:39:28.339496693 +0000 UTC m=+203.481042507" watchObservedRunningTime="2026-03-20 10:39:28.343401506 +0000 UTC m=+203.484947320" Mar 20 10:39:28 crc kubenswrapper[4748]: I0320 10:39:28.366443 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z78kn" podStartSLOduration=3.074772122 podStartE2EDuration="1m3.366416373s" podCreationTimestamp="2026-03-20 10:38:25 +0000 UTC" firstStartedPulling="2026-03-20 10:38:27.530339378 +0000 UTC m=+142.671885192" lastFinishedPulling="2026-03-20 10:39:27.821983629 +0000 UTC m=+202.963529443" observedRunningTime="2026-03-20 10:39:28.361530844 +0000 UTC m=+203.503076658" watchObservedRunningTime="2026-03-20 10:39:28.366416373 +0000 UTC m=+203.507962197" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 10:39:32.237364 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 
10:39:32.237933 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 10:39:32.438479 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 10:39:32.438558 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 10:39:32.475551 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 10:39:32.484772 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 10:39:32.641303 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 10:39:32.641353 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 10:39:32.680253 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 10:39:32.978503 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:39:32 crc kubenswrapper[4748]: I0320 10:39:32.978564 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:39:33 crc kubenswrapper[4748]: I0320 10:39:33.023887 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:39:33 crc kubenswrapper[4748]: I0320 10:39:33.275941 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:39:33 crc kubenswrapper[4748]: I0320 10:39:33.280560 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:39:33 crc kubenswrapper[4748]: I0320 10:39:33.285307 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:39:33 crc kubenswrapper[4748]: I0320 10:39:33.299895 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" podUID="4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" containerName="oauth-openshift" containerID="cri-o://d5f20b96ea371a511f703240ed488aebd4732d30be3a9a7e46e36675c6839e85" gracePeriod=15 Mar 20 10:39:33 crc kubenswrapper[4748]: I0320 10:39:33.300795 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.240483 4748 generic.go:334] "Generic (PLEG): container finished" podID="4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" containerID="d5f20b96ea371a511f703240ed488aebd4732d30be3a9a7e46e36675c6839e85" exitCode=0 Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.240540 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" event={"ID":"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28","Type":"ContainerDied","Data":"d5f20b96ea371a511f703240ed488aebd4732d30be3a9a7e46e36675c6839e85"} Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.430335 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.611797 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-idp-0-file-data\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612357 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-login\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612386 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-session\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612424 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-ocp-branding-template\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612462 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-service-ca\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: 
\"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612510 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-policies\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612537 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-trusted-ca-bundle\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612567 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-error\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612590 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-router-certs\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612619 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-provider-selection\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc 
kubenswrapper[4748]: I0320 10:39:34.612658 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-serving-cert\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612680 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-kube-api-access-tbbln\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612764 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-dir\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.612885 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-cliconfig\") pod \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\" (UID: \"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28\") " Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.613397 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.613600 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.614112 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.615008 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.616217 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.623031 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.623255 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-kube-api-access-tbbln" (OuterVolumeSpecName: "kube-api-access-tbbln") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "kube-api-access-tbbln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.623567 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.623852 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.624090 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.630604 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.634508 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8kqb5" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.634558 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8kqb5" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.638633 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.643135 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.643392 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" (UID: "4d3ff749-37cf-414c-ad3b-fe72fc1cfe28"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.713869 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8kqb5" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.714119 4748 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.714588 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.714676 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.714769 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.714906 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.715024 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath 
\"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.715153 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.715267 4748 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.715383 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.715492 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.715611 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.715769 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.715907 4748 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.716011 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbbln\" (UniqueName: \"kubernetes.io/projected/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28-kube-api-access-tbbln\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.744394 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mhpdf"] Mar 20 10:39:34 crc kubenswrapper[4748]: I0320 10:39:34.951123 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5fgxw"] Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.032103 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.032190 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.090668 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.248029 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.248035 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c5vw7" event={"ID":"4d3ff749-37cf-414c-ad3b-fe72fc1cfe28","Type":"ContainerDied","Data":"6e5c6044d4ee71169aebf20b11677808229d917450e7803e53d2be232d9ddc4f"} Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.248126 4748 scope.go:117] "RemoveContainer" containerID="d5f20b96ea371a511f703240ed488aebd4732d30be3a9a7e46e36675c6839e85" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.248552 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5fgxw" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerName="registry-server" containerID="cri-o://b82a5b5b2403e77bdaf06114412968e18f377f7d281f2a81555980d0dba52549" gracePeriod=2 Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.248943 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mhpdf" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerName="registry-server" containerID="cri-o://ba911fedc155e07ca9be1b8ab528bc7e5004920acc9881ff33a627adf1550e72" gracePeriod=2 Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.291765 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c5vw7"] Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.297574 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.298264 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c5vw7"] Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.313529 4748 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8kqb5" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.436330 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.436395 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.479626 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.533664 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" path="/var/lib/kubelet/pods/4d3ff749-37cf-414c-ad3b-fe72fc1cfe28/volumes" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.627247 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8467bc5bd-rdzdn"] Mar 20 10:39:35 crc kubenswrapper[4748]: E0320 10:39:35.627635 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de9aa72-edab-4ae9-b2dd-e20ef6b83277" containerName="oc" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.627657 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de9aa72-edab-4ae9-b2dd-e20ef6b83277" containerName="oc" Mar 20 10:39:35 crc kubenswrapper[4748]: E0320 10:39:35.627685 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" containerName="oauth-openshift" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.627699 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" containerName="oauth-openshift" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.627942 4748 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0de9aa72-edab-4ae9-b2dd-e20ef6b83277" containerName="oc" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.627977 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3ff749-37cf-414c-ad3b-fe72fc1cfe28" containerName="oauth-openshift" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.628622 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.637507 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.645175 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.646114 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.646344 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.646819 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.647345 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.647658 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.647862 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 
10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.648542 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.650534 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.651854 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.660546 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.679120 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8467bc5bd-rdzdn"] Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.679303 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.683362 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.693639 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.694663 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.696694 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.728491 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-template-error\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.728560 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.728589 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-session\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.728823 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-router-certs\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.729009 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.729054 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-service-ca\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.729085 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.729138 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-template-login\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.729297 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-audit-policies\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " 
pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.729412 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.729466 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg6gp\" (UniqueName: \"kubernetes.io/projected/ba95d0b6-2674-4d41-958a-0a311023cf24-kube-api-access-xg6gp\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.729550 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.729598 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.729631 4748 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba95d0b6-2674-4d41-958a-0a311023cf24-audit-dir\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.752352 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.831444 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-template-error\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.831779 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-session\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.831807 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.831873 4748 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-router-certs\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.831923 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.831946 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-service-ca\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.831969 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.831996 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-template-login\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: 
\"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.832028 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-audit-policies\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.832052 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.832076 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg6gp\" (UniqueName: \"kubernetes.io/projected/ba95d0b6-2674-4d41-958a-0a311023cf24-kube-api-access-xg6gp\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.832103 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.832127 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.832155 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba95d0b6-2674-4d41-958a-0a311023cf24-audit-dir\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.832226 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba95d0b6-2674-4d41-958a-0a311023cf24-audit-dir\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.833332 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-audit-policies\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.833446 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-service-ca\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.833681 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.834492 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.837414 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-template-error\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.837516 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.837559 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-router-certs\") pod \"oauth-openshift-8467bc5bd-rdzdn\" 
(UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.838259 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.839300 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.839313 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-session\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.840323 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-user-template-login\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.841776 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba95d0b6-2674-4d41-958a-0a311023cf24-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.848508 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg6gp\" (UniqueName: \"kubernetes.io/projected/ba95d0b6-2674-4d41-958a-0a311023cf24-kube-api-access-xg6gp\") pod \"oauth-openshift-8467bc5bd-rdzdn\" (UID: \"ba95d0b6-2674-4d41-958a-0a311023cf24\") " pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:35 crc kubenswrapper[4748]: I0320 10:39:35.962437 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.256695 4748 generic.go:334] "Generic (PLEG): container finished" podID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerID="ba911fedc155e07ca9be1b8ab528bc7e5004920acc9881ff33a627adf1550e72" exitCode=0 Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.256776 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhpdf" event={"ID":"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7","Type":"ContainerDied","Data":"ba911fedc155e07ca9be1b8ab528bc7e5004920acc9881ff33a627adf1550e72"} Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.258429 4748 generic.go:334] "Generic (PLEG): container finished" podID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerID="b82a5b5b2403e77bdaf06114412968e18f377f7d281f2a81555980d0dba52549" exitCode=0 Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.258470 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fgxw" 
event={"ID":"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b","Type":"ContainerDied","Data":"b82a5b5b2403e77bdaf06114412968e18f377f7d281f2a81555980d0dba52549"} Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.331207 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.349219 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.385641 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.392761 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.413606 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8467bc5bd-rdzdn"] Mar 20 10:39:36 crc kubenswrapper[4748]: W0320 10:39:36.419800 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba95d0b6_2674_4d41_958a_0a311023cf24.slice/crio-859a709bb8735c85bc445d9b1cc5ec8933667b1d6f55cdad9e0c1c07ad055259 WatchSource:0}: Error finding container 859a709bb8735c85bc445d9b1cc5ec8933667b1d6f55cdad9e0c1c07ad055259: Status 404 returned error can't find the container with id 859a709bb8735c85bc445d9b1cc5ec8933667b1d6f55cdad9e0c1c07ad055259 Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.440322 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-utilities\") pod \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " Mar 20 
10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.440440 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-catalog-content\") pod \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.440485 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjvb4\" (UniqueName: \"kubernetes.io/projected/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-kube-api-access-sjvb4\") pod \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\" (UID: \"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7\") " Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.441755 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-utilities" (OuterVolumeSpecName: "utilities") pod "623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" (UID: "623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.447175 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-kube-api-access-sjvb4" (OuterVolumeSpecName: "kube-api-access-sjvb4") pod "623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" (UID: "623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7"). InnerVolumeSpecName "kube-api-access-sjvb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.500801 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" (UID: "623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.541464 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgnhj\" (UniqueName: \"kubernetes.io/projected/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-kube-api-access-mgnhj\") pod \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.541595 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-catalog-content\") pod \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.541624 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-utilities\") pod \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\" (UID: \"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b\") " Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.541828 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.541859 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.541873 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjvb4\" (UniqueName: \"kubernetes.io/projected/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7-kube-api-access-sjvb4\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.542933 
4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-utilities" (OuterVolumeSpecName: "utilities") pod "e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" (UID: "e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.545896 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-kube-api-access-mgnhj" (OuterVolumeSpecName: "kube-api-access-mgnhj") pod "e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" (UID: "e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b"). InnerVolumeSpecName "kube-api-access-mgnhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.600926 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" (UID: "e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.643108 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.643186 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:36 crc kubenswrapper[4748]: I0320 10:39:36.643212 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgnhj\" (UniqueName: \"kubernetes.io/projected/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b-kube-api-access-mgnhj\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.149220 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsq7p"] Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.269744 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhpdf" event={"ID":"623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7","Type":"ContainerDied","Data":"249624e5b4dfee6dd0ee675c9b5ddcc0e94f027e96561df06404e467bf4cd4c9"} Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.269864 4748 scope.go:117] "RemoveContainer" containerID="ba911fedc155e07ca9be1b8ab528bc7e5004920acc9881ff33a627adf1550e72" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.269952 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhpdf" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.274332 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" event={"ID":"ba95d0b6-2674-4d41-958a-0a311023cf24","Type":"ContainerStarted","Data":"7312626202cde41d096282803bb6ec1c25dce0d0983d10ed17b1757d6a10d6b0"} Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.274419 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" event={"ID":"ba95d0b6-2674-4d41-958a-0a311023cf24","Type":"ContainerStarted","Data":"859a709bb8735c85bc445d9b1cc5ec8933667b1d6f55cdad9e0c1c07ad055259"} Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.275267 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.278769 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fgxw" event={"ID":"e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b","Type":"ContainerDied","Data":"2b5d251b09957d35bc3c64e4fae137287bbfa416183552edd9f4f2ad7e06beaa"} Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.278931 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bsq7p" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerName="registry-server" containerID="cri-o://3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0" gracePeriod=2 Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.279049 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5fgxw" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.286486 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.295947 4748 scope.go:117] "RemoveContainer" containerID="b4effbfd33d6ef8d91e8660fe1d1fbc0343f651759fe0b9791c78d2386c70c23" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.362464 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8467bc5bd-rdzdn" podStartSLOduration=29.362433363 podStartE2EDuration="29.362433363s" podCreationTimestamp="2026-03-20 10:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:39:37.346158103 +0000 UTC m=+212.487703977" watchObservedRunningTime="2026-03-20 10:39:37.362433363 +0000 UTC m=+212.503979197" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.364169 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5fgxw"] Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.365008 4748 scope.go:117] "RemoveContainer" containerID="31def6896a1e41acae6771bff4c30bfe314ab0085f30b94adecad53864befc02" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.372051 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5fgxw"] Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.473798 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mhpdf"] Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.473922 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mhpdf"] Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.520168 4748 scope.go:117] 
"RemoveContainer" containerID="b82a5b5b2403e77bdaf06114412968e18f377f7d281f2a81555980d0dba52549" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.535463 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" path="/var/lib/kubelet/pods/623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7/volumes" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.536345 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" path="/var/lib/kubelet/pods/e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b/volumes" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.559689 4748 scope.go:117] "RemoveContainer" containerID="ec0102935c3f45ce38fa4fc9c17992d5373ea8705bcb5a1a8545406280dc0745" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.589575 4748 scope.go:117] "RemoveContainer" containerID="90e88093111b4b53979d7ae3e3ea5b78ac9039492237a398606f6c5608ebbbec" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.819030 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.970445 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-catalog-content\") pod \"8e249463-f3e7-4aed-a0ac-97c54af87949\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.970510 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8bpp\" (UniqueName: \"kubernetes.io/projected/8e249463-f3e7-4aed-a0ac-97c54af87949-kube-api-access-g8bpp\") pod \"8e249463-f3e7-4aed-a0ac-97c54af87949\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.970615 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-utilities\") pod \"8e249463-f3e7-4aed-a0ac-97c54af87949\" (UID: \"8e249463-f3e7-4aed-a0ac-97c54af87949\") " Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.972151 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-utilities" (OuterVolumeSpecName: "utilities") pod "8e249463-f3e7-4aed-a0ac-97c54af87949" (UID: "8e249463-f3e7-4aed-a0ac-97c54af87949"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:39:37 crc kubenswrapper[4748]: I0320 10:39:37.981904 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e249463-f3e7-4aed-a0ac-97c54af87949-kube-api-access-g8bpp" (OuterVolumeSpecName: "kube-api-access-g8bpp") pod "8e249463-f3e7-4aed-a0ac-97c54af87949" (UID: "8e249463-f3e7-4aed-a0ac-97c54af87949"). InnerVolumeSpecName "kube-api-access-g8bpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.072422 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.072679 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8bpp\" (UniqueName: \"kubernetes.io/projected/8e249463-f3e7-4aed-a0ac-97c54af87949-kube-api-access-g8bpp\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.256208 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e249463-f3e7-4aed-a0ac-97c54af87949" (UID: "8e249463-f3e7-4aed-a0ac-97c54af87949"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.274863 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e249463-f3e7-4aed-a0ac-97c54af87949-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.291043 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerID="3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0" exitCode=0 Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.291161 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsq7p" event={"ID":"8e249463-f3e7-4aed-a0ac-97c54af87949","Type":"ContainerDied","Data":"3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0"} Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.291212 4748 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bsq7p" event={"ID":"8e249463-f3e7-4aed-a0ac-97c54af87949","Type":"ContainerDied","Data":"56dee9e757cfe3109602a31f1bd0ccad534b7f0580925c8e5b8035ac2ef22672"} Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.291225 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsq7p" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.291248 4748 scope.go:117] "RemoveContainer" containerID="3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.318139 4748 scope.go:117] "RemoveContainer" containerID="1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.338079 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsq7p"] Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.340226 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsq7p"] Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.363341 4748 scope.go:117] "RemoveContainer" containerID="564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.384775 4748 scope.go:117] "RemoveContainer" containerID="3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0" Mar 20 10:39:38 crc kubenswrapper[4748]: E0320 10:39:38.385377 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0\": container with ID starting with 3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0 not found: ID does not exist" containerID="3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.385449 4748 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0"} err="failed to get container status \"3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0\": rpc error: code = NotFound desc = could not find container \"3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0\": container with ID starting with 3be9a0e52c6e54d65e9726e30a7c304d771450e424d9b3eb68d45d5ee7c24de0 not found: ID does not exist" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.385494 4748 scope.go:117] "RemoveContainer" containerID="1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a" Mar 20 10:39:38 crc kubenswrapper[4748]: E0320 10:39:38.385996 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a\": container with ID starting with 1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a not found: ID does not exist" containerID="1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.386025 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a"} err="failed to get container status \"1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a\": rpc error: code = NotFound desc = could not find container \"1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a\": container with ID starting with 1e351fa3a05224b4c450ae73d2a9dbbb3ca4d207610e880af3bf45a3b8f2330a not found: ID does not exist" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.386050 4748 scope.go:117] "RemoveContainer" containerID="564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02" Mar 20 10:39:38 crc kubenswrapper[4748]: E0320 
10:39:38.386534 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02\": container with ID starting with 564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02 not found: ID does not exist" containerID="564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02" Mar 20 10:39:38 crc kubenswrapper[4748]: I0320 10:39:38.386814 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02"} err="failed to get container status \"564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02\": rpc error: code = NotFound desc = could not find container \"564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02\": container with ID starting with 564bf0e0f2460ed08e96ea907f32c0afaaa15a5251e40b3ff1b63fefe1ccff02 not found: ID does not exist" Mar 20 10:39:39 crc kubenswrapper[4748]: I0320 10:39:39.528426 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" path="/var/lib/kubelet/pods/8e249463-f3e7-4aed-a0ac-97c54af87949/volumes" Mar 20 10:39:39 crc kubenswrapper[4748]: I0320 10:39:39.549701 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4jdq"] Mar 20 10:39:39 crc kubenswrapper[4748]: I0320 10:39:39.550174 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c4jdq" podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" containerName="registry-server" containerID="cri-o://25add1f159d9c2b06a457989173a938f8c1e9b19a8a2bf039a4488820dfa8b84" gracePeriod=2 Mar 20 10:39:41 crc kubenswrapper[4748]: I0320 10:39:41.344033 4748 generic.go:334] "Generic (PLEG): container finished" podID="a82b758c-eb62-429b-b092-5ba4a7cf665d" 
containerID="25add1f159d9c2b06a457989173a938f8c1e9b19a8a2bf039a4488820dfa8b84" exitCode=0 Mar 20 10:39:41 crc kubenswrapper[4748]: I0320 10:39:41.344117 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4jdq" event={"ID":"a82b758c-eb62-429b-b092-5ba4a7cf665d","Type":"ContainerDied","Data":"25add1f159d9c2b06a457989173a938f8c1e9b19a8a2bf039a4488820dfa8b84"} Mar 20 10:39:41 crc kubenswrapper[4748]: I0320 10:39:41.768721 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:39:41 crc kubenswrapper[4748]: I0320 10:39:41.892003 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-catalog-content\") pod \"a82b758c-eb62-429b-b092-5ba4a7cf665d\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " Mar 20 10:39:41 crc kubenswrapper[4748]: I0320 10:39:41.892157 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-utilities\") pod \"a82b758c-eb62-429b-b092-5ba4a7cf665d\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " Mar 20 10:39:41 crc kubenswrapper[4748]: I0320 10:39:41.892246 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsdvt\" (UniqueName: \"kubernetes.io/projected/a82b758c-eb62-429b-b092-5ba4a7cf665d-kube-api-access-jsdvt\") pod \"a82b758c-eb62-429b-b092-5ba4a7cf665d\" (UID: \"a82b758c-eb62-429b-b092-5ba4a7cf665d\") " Mar 20 10:39:41 crc kubenswrapper[4748]: I0320 10:39:41.894534 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-utilities" (OuterVolumeSpecName: "utilities") pod "a82b758c-eb62-429b-b092-5ba4a7cf665d" (UID: 
"a82b758c-eb62-429b-b092-5ba4a7cf665d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:39:41 crc kubenswrapper[4748]: I0320 10:39:41.898901 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a82b758c-eb62-429b-b092-5ba4a7cf665d-kube-api-access-jsdvt" (OuterVolumeSpecName: "kube-api-access-jsdvt") pod "a82b758c-eb62-429b-b092-5ba4a7cf665d" (UID: "a82b758c-eb62-429b-b092-5ba4a7cf665d"). InnerVolumeSpecName "kube-api-access-jsdvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:41 crc kubenswrapper[4748]: I0320 10:39:41.994102 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsdvt\" (UniqueName: \"kubernetes.io/projected/a82b758c-eb62-429b-b092-5ba4a7cf665d-kube-api-access-jsdvt\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:41 crc kubenswrapper[4748]: I0320 10:39:41.994257 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.051438 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s"] Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.051782 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" podUID="71cc80b6-74af-413e-8b36-3d637f7dd31e" containerName="controller-manager" containerID="cri-o://1fafcb8bc819d47a30cc4b17aa80ff00f13cf806a334cc81113624f47a1f8f69" gracePeriod=30 Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.139402 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn"] Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.140043 4748 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" podUID="7798b202-00ed-40e7-b4d2-abccb941ed18" containerName="route-controller-manager" containerID="cri-o://f04b0db02939f225cfe90cae89f401ba8e7ef1d9a60716a6a092411524c35c41" gracePeriod=30 Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.351609 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4jdq" event={"ID":"a82b758c-eb62-429b-b092-5ba4a7cf665d","Type":"ContainerDied","Data":"b7cb07f5cac3ad57926072d83de81838ad26e5baadc4b9e9a903bbe72a1b29e2"} Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.351662 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4jdq" Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.351678 4748 scope.go:117] "RemoveContainer" containerID="25add1f159d9c2b06a457989173a938f8c1e9b19a8a2bf039a4488820dfa8b84" Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.373730 4748 scope.go:117] "RemoveContainer" containerID="0502488e69ed276258241b28cb6dd7e542373a7e6f016787a30dbf2bb32acef8" Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.388647 4748 scope.go:117] "RemoveContainer" containerID="bfc6ee5bd4710e309e9817d2fe156e9ed9ab83c170378c330999947644108fa4" Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.426359 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a82b758c-eb62-429b-b092-5ba4a7cf665d" (UID: "a82b758c-eb62-429b-b092-5ba4a7cf665d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.500782 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a82b758c-eb62-429b-b092-5ba4a7cf665d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.696624 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4jdq"] Mar 20 10:39:42 crc kubenswrapper[4748]: I0320 10:39:42.702545 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c4jdq"] Mar 20 10:39:43 crc kubenswrapper[4748]: I0320 10:39:43.361817 4748 generic.go:334] "Generic (PLEG): container finished" podID="7798b202-00ed-40e7-b4d2-abccb941ed18" containerID="f04b0db02939f225cfe90cae89f401ba8e7ef1d9a60716a6a092411524c35c41" exitCode=0 Mar 20 10:39:43 crc kubenswrapper[4748]: I0320 10:39:43.361915 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" event={"ID":"7798b202-00ed-40e7-b4d2-abccb941ed18","Type":"ContainerDied","Data":"f04b0db02939f225cfe90cae89f401ba8e7ef1d9a60716a6a092411524c35c41"} Mar 20 10:39:43 crc kubenswrapper[4748]: I0320 10:39:43.363769 4748 generic.go:334] "Generic (PLEG): container finished" podID="71cc80b6-74af-413e-8b36-3d637f7dd31e" containerID="1fafcb8bc819d47a30cc4b17aa80ff00f13cf806a334cc81113624f47a1f8f69" exitCode=0 Mar 20 10:39:43 crc kubenswrapper[4748]: I0320 10:39:43.363803 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" event={"ID":"71cc80b6-74af-413e-8b36-3d637f7dd31e","Type":"ContainerDied","Data":"1fafcb8bc819d47a30cc4b17aa80ff00f13cf806a334cc81113624f47a1f8f69"} Mar 20 10:39:43 crc kubenswrapper[4748]: I0320 10:39:43.528252 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" path="/var/lib/kubelet/pods/a82b758c-eb62-429b-b092-5ba4a7cf665d/volumes" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.297341 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.339800 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54"] Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340198 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" containerName="extract-utilities" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340233 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" containerName="extract-utilities" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340256 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerName="extract-utilities" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340268 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerName="extract-utilities" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340292 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7798b202-00ed-40e7-b4d2-abccb941ed18" containerName="route-controller-manager" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340304 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7798b202-00ed-40e7-b4d2-abccb941ed18" containerName="route-controller-manager" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340333 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerName="extract-content" Mar 20 10:39:44 crc 
kubenswrapper[4748]: I0320 10:39:44.340346 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerName="extract-content" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340361 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" containerName="extract-content" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340374 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" containerName="extract-content" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340390 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerName="registry-server" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340402 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerName="registry-server" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340418 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerName="extract-content" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340430 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerName="extract-content" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340444 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerName="extract-content" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340456 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerName="extract-content" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340473 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" containerName="registry-server" Mar 20 10:39:44 crc 
kubenswrapper[4748]: I0320 10:39:44.340485 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" containerName="registry-server" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340502 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerName="extract-utilities" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340514 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerName="extract-utilities" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340533 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerName="extract-utilities" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340545 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerName="extract-utilities" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340559 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerName="registry-server" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340571 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerName="registry-server" Mar 20 10:39:44 crc kubenswrapper[4748]: E0320 10:39:44.340587 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerName="registry-server" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340598 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerName="registry-server" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340765 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0225901-1ac3-4097-9b8b-4a2c7f0f3f3b" containerName="registry-server" Mar 20 10:39:44 crc 
kubenswrapper[4748]: I0320 10:39:44.340788 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a82b758c-eb62-429b-b092-5ba4a7cf665d" containerName="registry-server" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340807 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="7798b202-00ed-40e7-b4d2-abccb941ed18" containerName="route-controller-manager" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340826 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e249463-f3e7-4aed-a0ac-97c54af87949" containerName="registry-server" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.340876 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="623e7576-03b2-41ba-9cd5-e7d3ab7ecdd7" containerName="registry-server" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.341614 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.349339 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54"] Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.371346 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" event={"ID":"7798b202-00ed-40e7-b4d2-abccb941ed18","Type":"ContainerDied","Data":"82262975ba0696e1068ec23ec9ca1cf4b588d884a674b4ca9d79628e8d1672aa"} Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.371408 4748 scope.go:117] "RemoveContainer" containerID="f04b0db02939f225cfe90cae89f401ba8e7ef1d9a60716a6a092411524c35c41" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.371437 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.432410 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tnp4\" (UniqueName: \"kubernetes.io/projected/7798b202-00ed-40e7-b4d2-abccb941ed18-kube-api-access-9tnp4\") pod \"7798b202-00ed-40e7-b4d2-abccb941ed18\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.432478 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7798b202-00ed-40e7-b4d2-abccb941ed18-serving-cert\") pod \"7798b202-00ed-40e7-b4d2-abccb941ed18\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.432522 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-config\") pod \"7798b202-00ed-40e7-b4d2-abccb941ed18\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.432567 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-client-ca\") pod \"7798b202-00ed-40e7-b4d2-abccb941ed18\" (UID: \"7798b202-00ed-40e7-b4d2-abccb941ed18\") " Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.432822 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ad00a69-1629-4543-9630-d26e30470649-client-ca\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 
10:39:44.432892 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-759n6\" (UniqueName: \"kubernetes.io/projected/2ad00a69-1629-4543-9630-d26e30470649-kube-api-access-759n6\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.432956 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ad00a69-1629-4543-9630-d26e30470649-config\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.432990 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad00a69-1629-4543-9630-d26e30470649-serving-cert\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.433952 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-config" (OuterVolumeSpecName: "config") pod "7798b202-00ed-40e7-b4d2-abccb941ed18" (UID: "7798b202-00ed-40e7-b4d2-abccb941ed18"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.433978 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-client-ca" (OuterVolumeSpecName: "client-ca") pod "7798b202-00ed-40e7-b4d2-abccb941ed18" (UID: "7798b202-00ed-40e7-b4d2-abccb941ed18"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.437606 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7798b202-00ed-40e7-b4d2-abccb941ed18-kube-api-access-9tnp4" (OuterVolumeSpecName: "kube-api-access-9tnp4") pod "7798b202-00ed-40e7-b4d2-abccb941ed18" (UID: "7798b202-00ed-40e7-b4d2-abccb941ed18"). InnerVolumeSpecName "kube-api-access-9tnp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.438341 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7798b202-00ed-40e7-b4d2-abccb941ed18-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7798b202-00ed-40e7-b4d2-abccb941ed18" (UID: "7798b202-00ed-40e7-b4d2-abccb941ed18"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.470779 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.535249 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-759n6\" (UniqueName: \"kubernetes.io/projected/2ad00a69-1629-4543-9630-d26e30470649-kube-api-access-759n6\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.535380 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ad00a69-1629-4543-9630-d26e30470649-config\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.535423 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad00a69-1629-4543-9630-d26e30470649-serving-cert\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.535552 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ad00a69-1629-4543-9630-d26e30470649-client-ca\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.535645 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tnp4\" (UniqueName: 
\"kubernetes.io/projected/7798b202-00ed-40e7-b4d2-abccb941ed18-kube-api-access-9tnp4\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.535671 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7798b202-00ed-40e7-b4d2-abccb941ed18-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.535692 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.535711 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7798b202-00ed-40e7-b4d2-abccb941ed18-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.536739 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ad00a69-1629-4543-9630-d26e30470649-client-ca\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.536890 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ad00a69-1629-4543-9630-d26e30470649-config\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.540896 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad00a69-1629-4543-9630-d26e30470649-serving-cert\") pod 
\"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.551965 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-759n6\" (UniqueName: \"kubernetes.io/projected/2ad00a69-1629-4543-9630-d26e30470649-kube-api-access-759n6\") pod \"route-controller-manager-bf6c965c8-lft54\" (UID: \"2ad00a69-1629-4543-9630-d26e30470649\") " pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.636848 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-proxy-ca-bundles\") pod \"71cc80b6-74af-413e-8b36-3d637f7dd31e\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.636939 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-client-ca\") pod \"71cc80b6-74af-413e-8b36-3d637f7dd31e\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.636970 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnj8m\" (UniqueName: \"kubernetes.io/projected/71cc80b6-74af-413e-8b36-3d637f7dd31e-kube-api-access-rnj8m\") pod \"71cc80b6-74af-413e-8b36-3d637f7dd31e\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.637039 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71cc80b6-74af-413e-8b36-3d637f7dd31e-serving-cert\") pod \"71cc80b6-74af-413e-8b36-3d637f7dd31e\" (UID: 
\"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.637065 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-config\") pod \"71cc80b6-74af-413e-8b36-3d637f7dd31e\" (UID: \"71cc80b6-74af-413e-8b36-3d637f7dd31e\") " Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.639213 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "71cc80b6-74af-413e-8b36-3d637f7dd31e" (UID: "71cc80b6-74af-413e-8b36-3d637f7dd31e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.639480 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-client-ca" (OuterVolumeSpecName: "client-ca") pod "71cc80b6-74af-413e-8b36-3d637f7dd31e" (UID: "71cc80b6-74af-413e-8b36-3d637f7dd31e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.639556 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-config" (OuterVolumeSpecName: "config") pod "71cc80b6-74af-413e-8b36-3d637f7dd31e" (UID: "71cc80b6-74af-413e-8b36-3d637f7dd31e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.641439 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71cc80b6-74af-413e-8b36-3d637f7dd31e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "71cc80b6-74af-413e-8b36-3d637f7dd31e" (UID: "71cc80b6-74af-413e-8b36-3d637f7dd31e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.644025 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71cc80b6-74af-413e-8b36-3d637f7dd31e-kube-api-access-rnj8m" (OuterVolumeSpecName: "kube-api-access-rnj8m") pod "71cc80b6-74af-413e-8b36-3d637f7dd31e" (UID: "71cc80b6-74af-413e-8b36-3d637f7dd31e"). InnerVolumeSpecName "kube-api-access-rnj8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.667738 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.738725 4748 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71cc80b6-74af-413e-8b36-3d637f7dd31e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.738779 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.738797 4748 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.738818 4748 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71cc80b6-74af-413e-8b36-3d637f7dd31e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.738883 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnj8m\" (UniqueName: \"kubernetes.io/projected/71cc80b6-74af-413e-8b36-3d637f7dd31e-kube-api-access-rnj8m\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.748012 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn"] Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.754355 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c68db7c4-66ddn"] Mar 20 10:39:44 crc kubenswrapper[4748]: I0320 10:39:44.926939 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54"] Mar 20 10:39:44 crc kubenswrapper[4748]: W0320 10:39:44.935112 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ad00a69_1629_4543_9630_d26e30470649.slice/crio-97c64b2a998d8acef949b7699e81365958c45fdc1a455e8e72a7fdb0c52fbe45 WatchSource:0}: Error finding container 97c64b2a998d8acef949b7699e81365958c45fdc1a455e8e72a7fdb0c52fbe45: Status 404 returned error can't find the container with id 97c64b2a998d8acef949b7699e81365958c45fdc1a455e8e72a7fdb0c52fbe45 Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.380527 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" event={"ID":"71cc80b6-74af-413e-8b36-3d637f7dd31e","Type":"ContainerDied","Data":"991be0f417c0bce5fcc68a3789b0dd1f80bb7e29110b4146b8981aa240bd54c5"} Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.380604 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s" Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.380861 4748 scope.go:117] "RemoveContainer" containerID="1fafcb8bc819d47a30cc4b17aa80ff00f13cf806a334cc81113624f47a1f8f69" Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.394610 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" event={"ID":"2ad00a69-1629-4543-9630-d26e30470649","Type":"ContainerStarted","Data":"d1e87c61038cc6d9018a0cac1895497ab8cd65bb0d5a3a8daebd409a8cba5b1a"} Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.394663 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" event={"ID":"2ad00a69-1629-4543-9630-d26e30470649","Type":"ContainerStarted","Data":"97c64b2a998d8acef949b7699e81365958c45fdc1a455e8e72a7fdb0c52fbe45"} Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.394868 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.415188 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" podStartSLOduration=3.415167205 podStartE2EDuration="3.415167205s" podCreationTimestamp="2026-03-20 10:39:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:39:45.413448381 +0000 UTC m=+220.554994205" watchObservedRunningTime="2026-03-20 10:39:45.415167205 +0000 UTC m=+220.556713019" Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.435198 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s"] Mar 20 
10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.439486 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c6bdd87c4-fmn7s"] Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.491147 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bf6c965c8-lft54" Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.524966 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71cc80b6-74af-413e-8b36-3d637f7dd31e" path="/var/lib/kubelet/pods/71cc80b6-74af-413e-8b36-3d637f7dd31e/volumes" Mar 20 10:39:45 crc kubenswrapper[4748]: I0320 10:39:45.525701 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7798b202-00ed-40e7-b4d2-abccb941ed18" path="/var/lib/kubelet/pods/7798b202-00ed-40e7-b4d2-abccb941ed18/volumes" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.640382 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-845559dd54-v8dfx"] Mar 20 10:39:46 crc kubenswrapper[4748]: E0320 10:39:46.641371 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc80b6-74af-413e-8b36-3d637f7dd31e" containerName="controller-manager" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.641402 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc80b6-74af-413e-8b36-3d637f7dd31e" containerName="controller-manager" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.641619 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc80b6-74af-413e-8b36-3d637f7dd31e" containerName="controller-manager" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.642407 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.649625 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.650258 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.650477 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.650536 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.650902 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.651473 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.651907 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-845559dd54-v8dfx"] Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.660907 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.768410 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4d695a1-8a7f-4860-85b7-5f09d660027c-config\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " 
pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.768492 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4d695a1-8a7f-4860-85b7-5f09d660027c-serving-cert\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.768532 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6f45\" (UniqueName: \"kubernetes.io/projected/b4d695a1-8a7f-4860-85b7-5f09d660027c-kube-api-access-j6f45\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.768564 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4d695a1-8a7f-4860-85b7-5f09d660027c-proxy-ca-bundles\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.768604 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4d695a1-8a7f-4860-85b7-5f09d660027c-client-ca\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.870651 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b4d695a1-8a7f-4860-85b7-5f09d660027c-config\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.870784 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4d695a1-8a7f-4860-85b7-5f09d660027c-serving-cert\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.870924 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6f45\" (UniqueName: \"kubernetes.io/projected/b4d695a1-8a7f-4860-85b7-5f09d660027c-kube-api-access-j6f45\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.870988 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4d695a1-8a7f-4860-85b7-5f09d660027c-proxy-ca-bundles\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.871073 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4d695a1-8a7f-4860-85b7-5f09d660027c-client-ca\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 
10:39:46.872706 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4d695a1-8a7f-4860-85b7-5f09d660027c-client-ca\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.873202 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4d695a1-8a7f-4860-85b7-5f09d660027c-config\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.873262 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4d695a1-8a7f-4860-85b7-5f09d660027c-proxy-ca-bundles\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.881945 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4d695a1-8a7f-4860-85b7-5f09d660027c-serving-cert\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.901276 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6f45\" (UniqueName: \"kubernetes.io/projected/b4d695a1-8a7f-4860-85b7-5f09d660027c-kube-api-access-j6f45\") pod \"controller-manager-845559dd54-v8dfx\" (UID: \"b4d695a1-8a7f-4860-85b7-5f09d660027c\") " 
pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:46 crc kubenswrapper[4748]: I0320 10:39:46.968996 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:47 crc kubenswrapper[4748]: I0320 10:39:47.202673 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-845559dd54-v8dfx"] Mar 20 10:39:47 crc kubenswrapper[4748]: W0320 10:39:47.215145 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4d695a1_8a7f_4860_85b7_5f09d660027c.slice/crio-ec613fbaf176bc1a6128f22773f79ffcf30fde256fdba64c13ed2cafe1c864f2 WatchSource:0}: Error finding container ec613fbaf176bc1a6128f22773f79ffcf30fde256fdba64c13ed2cafe1c864f2: Status 404 returned error can't find the container with id ec613fbaf176bc1a6128f22773f79ffcf30fde256fdba64c13ed2cafe1c864f2 Mar 20 10:39:47 crc kubenswrapper[4748]: I0320 10:39:47.414145 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" event={"ID":"b4d695a1-8a7f-4860-85b7-5f09d660027c","Type":"ContainerStarted","Data":"c6ee74d930e7b50b3c0ff2d64cbf0b74c5ce190b2025f9b855ec774a8cee5803"} Mar 20 10:39:47 crc kubenswrapper[4748]: I0320 10:39:47.414208 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" event={"ID":"b4d695a1-8a7f-4860-85b7-5f09d660027c","Type":"ContainerStarted","Data":"ec613fbaf176bc1a6128f22773f79ffcf30fde256fdba64c13ed2cafe1c864f2"} Mar 20 10:39:47 crc kubenswrapper[4748]: I0320 10:39:47.414307 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:47 crc kubenswrapper[4748]: I0320 10:39:47.415921 4748 patch_prober.go:28] interesting 
pod/controller-manager-845559dd54-v8dfx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 20 10:39:47 crc kubenswrapper[4748]: I0320 10:39:47.415978 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" podUID="b4d695a1-8a7f-4860-85b7-5f09d660027c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.425992 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.452669 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-845559dd54-v8dfx" podStartSLOduration=6.452635414 podStartE2EDuration="6.452635414s" podCreationTimestamp="2026-03-20 10:39:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:39:47.439958229 +0000 UTC m=+222.581504143" watchObservedRunningTime="2026-03-20 10:39:48.452635414 +0000 UTC m=+223.594181258" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.975281 4748 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.977323 4748 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.977785 4748 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11" gracePeriod=15 Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.978019 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4" gracePeriod=15 Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.978045 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1" gracePeriod=15 Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.978028 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.978154 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b" gracePeriod=15 Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.978172 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf" gracePeriod=15 Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.980795 4748 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:39:48 crc kubenswrapper[4748]: E0320 10:39:48.981157 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981175 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:39:48 crc kubenswrapper[4748]: E0320 10:39:48.981193 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981204 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 10:39:48 crc kubenswrapper[4748]: E0320 10:39:48.981213 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981221 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: E0320 10:39:48.981235 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981243 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: E0320 10:39:48.981253 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981261 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: E0320 10:39:48.981297 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981306 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:39:48 crc kubenswrapper[4748]: E0320 10:39:48.981322 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981331 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: E0320 10:39:48.981343 4748 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981351 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:39:48 crc kubenswrapper[4748]: E0320 10:39:48.981361 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981369 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981490 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981503 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981519 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981533 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981541 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981551 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981560 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981571 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:39:48 crc kubenswrapper[4748]: E0320 10:39:48.981687 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981696 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:48 crc kubenswrapper[4748]: I0320 10:39:48.981827 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:39:49 crc kubenswrapper[4748]: E0320 10:39:49.040453 4748 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.109906 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.110760 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.110847 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.110976 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.111087 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.111157 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.111198 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.111237 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.213474 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.213609 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.213674 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.213745 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.213793 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.213867 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.213896 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.213928 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.214062 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.214127 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.214168 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.214212 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.214254 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.214299 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" 
(UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.214340 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.214380 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.342645 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:49 crc kubenswrapper[4748]: E0320 10:39:49.369262 4748 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e86806cb6b79e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:39:49.36840387 +0000 UTC m=+224.509949694,LastTimestamp:2026-03-20 10:39:49.36840387 +0000 UTC m=+224.509949694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.433185 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8e496b4d46df0ecc978c8f2807ddd5e208172b5053edf1026904de57bde2f7bf"} Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.435388 4748 generic.go:334] "Generic (PLEG): container finished" podID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" containerID="b3059ea8fec63ee9b7a63374cc436e6ce45be13174e66222e836a504de10238d" exitCode=0 Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.435478 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d","Type":"ContainerDied","Data":"b3059ea8fec63ee9b7a63374cc436e6ce45be13174e66222e836a504de10238d"} Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.436346 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.436794 4748 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: 
connection refused" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.439969 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.441549 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.443435 4748 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4" exitCode=0 Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.443499 4748 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf" exitCode=0 Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.443565 4748 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1" exitCode=0 Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.443587 4748 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b" exitCode=2 Mar 20 10:39:49 crc kubenswrapper[4748]: I0320 10:39:49.443581 4748 scope.go:117] "RemoveContainer" containerID="abfae41f906e09e4b2be25e2cdf0408811449414f51931ce149a8468a32a88c6" Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.456625 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e"} Mar 20 10:39:50 crc kubenswrapper[4748]: E0320 10:39:50.457558 4748 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.457533 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.460930 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.824005 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.825138 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.939508 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-var-lock\") pod \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.939645 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kubelet-dir\") pod \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.939771 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kube-api-access\") pod \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\" (UID: \"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d\") " Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.939954 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-var-lock" (OuterVolumeSpecName: "var-lock") pod "7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" (UID: "7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.939968 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" (UID: "7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.940660 4748 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.940691 4748 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:50 crc kubenswrapper[4748]: I0320 10:39:50.949071 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" (UID: "7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.043073 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.343214 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.344292 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.345181 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.345496 4748 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.448242 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.448692 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.448973 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.448403 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.448765 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.449043 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.449884 4748 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.450065 4748 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.450204 4748 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.474010 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.475072 4748 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11" exitCode=0 Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.475176 4748 scope.go:117] "RemoveContainer" containerID="401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.475354 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.484959 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d","Type":"ContainerDied","Data":"5fbf0b08149adccd996ee34fb49ea97a2662871262aeec59df320080123c6175"} Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.485032 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fbf0b08149adccd996ee34fb49ea97a2662871262aeec59df320080123c6175" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.485133 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:39:51 crc kubenswrapper[4748]: E0320 10:39:51.486147 4748 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.506555 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.506822 4748 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.510537 4748 scope.go:117] 
"RemoveContainer" containerID="989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.513523 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.513919 4748 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.529042 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.533662 4748 scope.go:117] "RemoveContainer" containerID="ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.550130 4748 scope.go:117] "RemoveContainer" containerID="15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.568120 4748 scope.go:117] "RemoveContainer" containerID="2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.593757 4748 scope.go:117] "RemoveContainer" containerID="b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.618227 4748 scope.go:117] "RemoveContainer" 
containerID="401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4" Mar 20 10:39:51 crc kubenswrapper[4748]: E0320 10:39:51.618787 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4\": container with ID starting with 401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4 not found: ID does not exist" containerID="401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.618858 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4"} err="failed to get container status \"401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4\": rpc error: code = NotFound desc = could not find container \"401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4\": container with ID starting with 401326b29a7df326d224b7ab4a9a1189d9219035a3df12a03d97e726d95641c4 not found: ID does not exist" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.618899 4748 scope.go:117] "RemoveContainer" containerID="989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf" Mar 20 10:39:51 crc kubenswrapper[4748]: E0320 10:39:51.619442 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\": container with ID starting with 989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf not found: ID does not exist" containerID="989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.619472 4748 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf"} err="failed to get container status \"989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\": rpc error: code = NotFound desc = could not find container \"989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf\": container with ID starting with 989be6314e2d0af65e9541b5b2a0be5aed080255658e56c1c5bd989d1a842acf not found: ID does not exist" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.619496 4748 scope.go:117] "RemoveContainer" containerID="ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1" Mar 20 10:39:51 crc kubenswrapper[4748]: E0320 10:39:51.620732 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\": container with ID starting with ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1 not found: ID does not exist" containerID="ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.620792 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1"} err="failed to get container status \"ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\": rpc error: code = NotFound desc = could not find container \"ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1\": container with ID starting with ed2d8bdb5b326ae91ba0ca3da26d434af7ae6a3c604fd3ff22eb0ae8b00a0aa1 not found: ID does not exist" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.620812 4748 scope.go:117] "RemoveContainer" containerID="15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b" Mar 20 10:39:51 crc kubenswrapper[4748]: E0320 10:39:51.621228 4748 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\": container with ID starting with 15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b not found: ID does not exist" containerID="15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.621282 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b"} err="failed to get container status \"15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\": rpc error: code = NotFound desc = could not find container \"15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b\": container with ID starting with 15d3d6d4f8c47119a747f3c76a9bb2b05c6f0030576418e787f3deb6e9f2485b not found: ID does not exist" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.621326 4748 scope.go:117] "RemoveContainer" containerID="2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11" Mar 20 10:39:51 crc kubenswrapper[4748]: E0320 10:39:51.621747 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\": container with ID starting with 2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11 not found: ID does not exist" containerID="2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.621786 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11"} err="failed to get container status \"2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\": rpc error: code = NotFound desc = could not find container 
\"2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11\": container with ID starting with 2c40d6e7336ba79828d98e8a8a2cb2e049ce6ad046d53a6337d9ab0d1ce0ee11 not found: ID does not exist" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.621807 4748 scope.go:117] "RemoveContainer" containerID="b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51" Mar 20 10:39:51 crc kubenswrapper[4748]: E0320 10:39:51.622158 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\": container with ID starting with b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51 not found: ID does not exist" containerID="b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51" Mar 20 10:39:51 crc kubenswrapper[4748]: I0320 10:39:51.622198 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51"} err="failed to get container status \"b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\": rpc error: code = NotFound desc = could not find container \"b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51\": container with ID starting with b974db792d69c06471dc5bd8f9e64e727b052347aed06e5a1c80653838984f51 not found: ID does not exist" Mar 20 10:39:53 crc kubenswrapper[4748]: E0320 10:39:53.331705 4748 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e86806cb6b79e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:39:49.36840387 +0000 UTC m=+224.509949694,LastTimestamp:2026-03-20 10:39:49.36840387 +0000 UTC m=+224.509949694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:39:54 crc kubenswrapper[4748]: E0320 10:39:54.985266 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:39:54Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:39:54Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:39:54Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:39:54Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.204:6443: 
connect: connection refused" Mar 20 10:39:54 crc kubenswrapper[4748]: E0320 10:39:54.986043 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:54 crc kubenswrapper[4748]: E0320 10:39:54.986289 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:54 crc kubenswrapper[4748]: E0320 10:39:54.986471 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:54 crc kubenswrapper[4748]: E0320 10:39:54.986651 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:54 crc kubenswrapper[4748]: E0320 10:39:54.986665 4748 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:39:55 crc kubenswrapper[4748]: I0320 10:39:55.516820 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:55 crc kubenswrapper[4748]: E0320 10:39:55.818217 4748 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:55 crc kubenswrapper[4748]: E0320 10:39:55.819306 4748 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:55 crc kubenswrapper[4748]: E0320 10:39:55.819897 4748 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:55 crc kubenswrapper[4748]: E0320 10:39:55.820613 4748 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:55 crc kubenswrapper[4748]: E0320 10:39:55.821540 4748 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:39:55 crc kubenswrapper[4748]: I0320 10:39:55.821611 4748 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 10:39:55 crc kubenswrapper[4748]: E0320 10:39:55.822122 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="200ms" Mar 20 10:39:56 crc kubenswrapper[4748]: E0320 10:39:56.023653 4748 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="400ms" Mar 20 10:39:56 crc kubenswrapper[4748]: E0320 10:39:56.424527 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="800ms" Mar 20 10:39:57 crc kubenswrapper[4748]: E0320 10:39:57.239117 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="1.6s" Mar 20 10:39:58 crc kubenswrapper[4748]: E0320 10:39:58.842405 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="3.2s" Mar 20 10:40:01 crc kubenswrapper[4748]: I0320 10:40:01.562639 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 10:40:01 crc kubenswrapper[4748]: I0320 10:40:01.564391 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 10:40:01 crc kubenswrapper[4748]: I0320 10:40:01.564460 4748 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="639e31d3a83e1b703bfdbfc7c3e881ef42ced35bc22d96d8917993131c97f72c" exitCode=1 Mar 20 10:40:01 crc 
kubenswrapper[4748]: I0320 10:40:01.564512 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"639e31d3a83e1b703bfdbfc7c3e881ef42ced35bc22d96d8917993131c97f72c"} Mar 20 10:40:01 crc kubenswrapper[4748]: I0320 10:40:01.565350 4748 scope.go:117] "RemoveContainer" containerID="639e31d3a83e1b703bfdbfc7c3e881ef42ced35bc22d96d8917993131c97f72c" Mar 20 10:40:01 crc kubenswrapper[4748]: I0320 10:40:01.567187 4748 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:01 crc kubenswrapper[4748]: I0320 10:40:01.567767 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:02 crc kubenswrapper[4748]: E0320 10:40:02.044463 4748 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="6.4s" Mar 20 10:40:02 crc kubenswrapper[4748]: I0320 10:40:02.574808 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 10:40:02 crc kubenswrapper[4748]: I0320 10:40:02.575562 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 10:40:02 crc kubenswrapper[4748]: I0320 10:40:02.575657 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dd310ce203086b1858f2c39aa08b7db78c023ecfcdcde856309b0b56e02e506f"} Mar 20 10:40:02 crc kubenswrapper[4748]: I0320 10:40:02.577032 4748 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:02 crc kubenswrapper[4748]: I0320 10:40:02.577858 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:03 crc kubenswrapper[4748]: E0320 10:40:03.333650 4748 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e86806cb6b79e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:39:49.36840387 +0000 UTC m=+224.509949694,LastTimestamp:2026-03-20 10:39:49.36840387 +0000 UTC m=+224.509949694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:40:04 crc kubenswrapper[4748]: I0320 10:40:04.514402 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:04 crc kubenswrapper[4748]: I0320 10:40:04.515559 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:04 crc kubenswrapper[4748]: I0320 10:40:04.516634 4748 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:04 crc kubenswrapper[4748]: I0320 10:40:04.532380 4748 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b5483745-95c8-4c6e-bd16-ec5fec57af5d" Mar 20 10:40:04 crc kubenswrapper[4748]: I0320 10:40:04.532419 4748 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b5483745-95c8-4c6e-bd16-ec5fec57af5d" Mar 20 10:40:04 crc kubenswrapper[4748]: E0320 10:40:04.533092 4748 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:04 crc kubenswrapper[4748]: I0320 10:40:04.533906 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:04 crc kubenswrapper[4748]: W0320 10:40:04.566985 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-ceb9eb9876a0b68c687c767cecbde2518364a730c5fcb25f970b2b17d1eb63d2 WatchSource:0}: Error finding container ceb9eb9876a0b68c687c767cecbde2518364a730c5fcb25f970b2b17d1eb63d2: Status 404 returned error can't find the container with id ceb9eb9876a0b68c687c767cecbde2518364a730c5fcb25f970b2b17d1eb63d2 Mar 20 10:40:04 crc kubenswrapper[4748]: I0320 10:40:04.590001 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ceb9eb9876a0b68c687c767cecbde2518364a730c5fcb25f970b2b17d1eb63d2"} Mar 20 10:40:05 crc kubenswrapper[4748]: E0320 10:40:05.048888 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:40:05Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:40:05Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:40:05Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:40:05Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: E0320 10:40:05.049547 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: E0320 10:40:05.050128 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: E0320 10:40:05.050539 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 
10:40:05 crc kubenswrapper[4748]: E0320 10:40:05.050930 4748 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: E0320 10:40:05.050955 4748 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.454013 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.459997 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.461017 4748 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.461591 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.529471 4748 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.530141 4748 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.531014 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.598723 4748 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="90215fce0d58ed9cbb2c9f8311a2845161122d86202be008fe1995332940bff5" exitCode=0 Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.598824 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"90215fce0d58ed9cbb2c9f8311a2845161122d86202be008fe1995332940bff5"} Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.599072 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.599613 4748 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b5483745-95c8-4c6e-bd16-ec5fec57af5d" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.599635 4748 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b5483745-95c8-4c6e-bd16-ec5fec57af5d" Mar 20 10:40:05 crc kubenswrapper[4748]: E0320 10:40:05.600267 4748 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.600294 4748 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.600966 4748 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: I0320 10:40:05.601367 4748 status_manager.go:851] "Failed to get status for pod" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Mar 20 10:40:05 crc kubenswrapper[4748]: E0320 10:40:05.609825 4748 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" volumeName="registry-storage" Mar 20 10:40:06 crc kubenswrapper[4748]: I0320 10:40:06.607908 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a5f41c0ef4b64a5cbbb95b02de8416bf83658cdab05398a1546f6b0d7ec53134"} Mar 20 10:40:06 crc kubenswrapper[4748]: I0320 10:40:06.608433 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ad81b5617d103ed35a933d28abb77ff9f3c60146528c57246d2d4e882d7a6e62"} Mar 20 10:40:06 crc kubenswrapper[4748]: I0320 10:40:06.608445 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5734948653c58fcb1e45f67e5e26990fd0be065f8f7617527ac9e3ed6a7e9ed5"} Mar 20 10:40:07 crc kubenswrapper[4748]: I0320 10:40:07.629864 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a5915d69736c1d01dc218fe31f29abafd39087be614486e77b81763461c60ce6"} Mar 20 10:40:07 crc kubenswrapper[4748]: I0320 10:40:07.629957 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"df2518ad8f5cf7fde28bab8f6692d0c2760887362112733a9a5ad05cee4ce09b"} Mar 20 10:40:07 crc kubenswrapper[4748]: I0320 10:40:07.630459 4748 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b5483745-95c8-4c6e-bd16-ec5fec57af5d" Mar 20 10:40:07 crc kubenswrapper[4748]: I0320 10:40:07.630491 4748 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b5483745-95c8-4c6e-bd16-ec5fec57af5d" Mar 20 10:40:07 crc kubenswrapper[4748]: I0320 10:40:07.630910 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:09 crc kubenswrapper[4748]: I0320 10:40:09.534050 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:09 crc kubenswrapper[4748]: I0320 10:40:09.534368 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:09 crc kubenswrapper[4748]: I0320 10:40:09.539698 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:12 crc kubenswrapper[4748]: I0320 10:40:12.643522 4748 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:12 crc kubenswrapper[4748]: I0320 10:40:12.672452 4748 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="02b740d3-e683-4cc1-84be-cadea771d1d0" Mar 20 10:40:12 crc kubenswrapper[4748]: I0320 10:40:12.928311 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:40:12 crc kubenswrapper[4748]: I0320 10:40:12.928796 4748 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:40:13 crc kubenswrapper[4748]: I0320 10:40:13.665723 4748 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b5483745-95c8-4c6e-bd16-ec5fec57af5d" Mar 20 10:40:13 crc kubenswrapper[4748]: I0320 10:40:13.665768 4748 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b5483745-95c8-4c6e-bd16-ec5fec57af5d" Mar 20 10:40:13 crc kubenswrapper[4748]: I0320 10:40:13.670190 4748 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="02b740d3-e683-4cc1-84be-cadea771d1d0" Mar 20 10:40:20 crc kubenswrapper[4748]: I0320 10:40:20.263745 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:40:22 crc kubenswrapper[4748]: I0320 10:40:22.008933 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 10:40:22 crc kubenswrapper[4748]: I0320 10:40:22.219705 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 10:40:22 crc kubenswrapper[4748]: I0320 10:40:22.526858 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:40:22 crc kubenswrapper[4748]: I0320 10:40:22.720736 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 10:40:22 crc kubenswrapper[4748]: I0320 
10:40:22.809787 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 10:40:22 crc kubenswrapper[4748]: I0320 10:40:22.942412 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 10:40:22 crc kubenswrapper[4748]: I0320 10:40:22.948425 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 10:40:22 crc kubenswrapper[4748]: I0320 10:40:22.951223 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 10:40:23 crc kubenswrapper[4748]: I0320 10:40:23.226219 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 10:40:23 crc kubenswrapper[4748]: I0320 10:40:23.589620 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 10:40:23 crc kubenswrapper[4748]: I0320 10:40:23.680134 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 10:40:23 crc kubenswrapper[4748]: I0320 10:40:23.857899 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 10:40:24 crc kubenswrapper[4748]: I0320 10:40:24.142699 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 10:40:24 crc kubenswrapper[4748]: I0320 10:40:24.152179 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 10:40:24 crc kubenswrapper[4748]: I0320 10:40:24.241113 4748 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"trusted-ca" Mar 20 10:40:24 crc kubenswrapper[4748]: I0320 10:40:24.407555 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 10:40:24 crc kubenswrapper[4748]: I0320 10:40:24.557365 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 10:40:24 crc kubenswrapper[4748]: I0320 10:40:24.569661 4748 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 10:40:24 crc kubenswrapper[4748]: I0320 10:40:24.780366 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 10:40:24 crc kubenswrapper[4748]: I0320 10:40:24.864853 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 10:40:24 crc kubenswrapper[4748]: I0320 10:40:24.991156 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.241751 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.275522 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.468299 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.561080 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.647462 4748 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.751778 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.776868 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.797085 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.912482 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.966931 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.990384 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 10:40:25 crc kubenswrapper[4748]: I0320 10:40:25.990504 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.047065 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.060961 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.111935 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.121395 
4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.135133 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.195188 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.223917 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.285527 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.314757 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.315220 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.367926 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.380625 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.517021 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.600779 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 10:40:26 
crc kubenswrapper[4748]: I0320 10:40:26.987130 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:40:26 crc kubenswrapper[4748]: I0320 10:40:26.988296 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.001338 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.023462 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.035384 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.069946 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.150602 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.189864 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.257769 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.260050 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.284310 4748 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.448313 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.453171 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.746735 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.748451 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.814132 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.831762 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.921354 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.925710 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.982046 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:40:27 crc kubenswrapper[4748]: I0320 10:40:27.991752 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.092465 4748 reflector.go:368] Caches 
populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.093993 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.129711 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.137229 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.148532 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.197761 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.199071 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.200741 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.208016 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.262257 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.346910 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 10:40:28 crc 
kubenswrapper[4748]: I0320 10:40:28.365945 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.390672 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.449941 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.509504 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.513316 4748 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.524299 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.568467 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.713417 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.740183 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.781888 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 10:40:28 crc kubenswrapper[4748]: I0320 10:40:28.960671 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 10:40:29 crc 
kubenswrapper[4748]: I0320 10:40:29.089442 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.191183 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.205940 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.260418 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.263729 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.322178 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.341533 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.368483 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.370014 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.479126 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.505305 4748 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.559568 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.707203 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.720590 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.854058 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.864522 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.946277 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 10:40:29 crc kubenswrapper[4748]: I0320 10:40:29.959977 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.063507 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.079978 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.112105 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.132089 4748 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.265755 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.293957 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.374119 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.419775 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.470970 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.491955 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.502059 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.518405 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.563178 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.602072 4748 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 10:40:30 crc kubenswrapper[4748]: 
I0320 10:40:30.610185 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.621326 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.621975 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.624201 4748 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b5483745-95c8-4c6e-bd16-ec5fec57af5d" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.624242 4748 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b5483745-95c8-4c6e-bd16-ec5fec57af5d" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.624219 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.630909 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.635876 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.668942 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.668907966 podStartE2EDuration="18.668907966s" podCreationTimestamp="2026-03-20 10:40:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:40:30.649449754 +0000 UTC m=+265.790995568" watchObservedRunningTime="2026-03-20 10:40:30.668907966 +0000 UTC 
m=+265.810453780" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.696866 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566720-mvtnf"] Mar 20 10:40:30 crc kubenswrapper[4748]: E0320 10:40:30.697120 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" containerName="installer" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.697134 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" containerName="installer" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.697249 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d42201c-acc6-40d3-85a5-ad7c4a0cbf8d" containerName="installer" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.697732 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566720-mvtnf" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.699714 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.700563 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.700564 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.709009 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.755488 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.757654 4748 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.785568 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.831179 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.853754 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr4lm\" (UniqueName: \"kubernetes.io/projected/31d344ca-51fc-428e-9703-7d57a3c4bd8d-kube-api-access-rr4lm\") pod \"auto-csr-approver-29566720-mvtnf\" (UID: \"31d344ca-51fc-428e-9703-7d57a3c4bd8d\") " pod="openshift-infra/auto-csr-approver-29566720-mvtnf" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.867868 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.918325 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.925971 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.954579 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.955251 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr4lm\" (UniqueName: \"kubernetes.io/projected/31d344ca-51fc-428e-9703-7d57a3c4bd8d-kube-api-access-rr4lm\") pod \"auto-csr-approver-29566720-mvtnf\" (UID: \"31d344ca-51fc-428e-9703-7d57a3c4bd8d\") " 
pod="openshift-infra/auto-csr-approver-29566720-mvtnf" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.960390 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 10:40:30 crc kubenswrapper[4748]: I0320 10:40:30.977343 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr4lm\" (UniqueName: \"kubernetes.io/projected/31d344ca-51fc-428e-9703-7d57a3c4bd8d-kube-api-access-rr4lm\") pod \"auto-csr-approver-29566720-mvtnf\" (UID: \"31d344ca-51fc-428e-9703-7d57a3c4bd8d\") " pod="openshift-infra/auto-csr-approver-29566720-mvtnf" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.002057 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.017090 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566720-mvtnf" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.129226 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.143105 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.175873 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.300669 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.306109 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 
10:40:31.332618 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.346576 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.372344 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.381696 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.407597 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.423682 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.535438 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.630534 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.727766 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.742535 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.766024 4748 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.823681 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.843562 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.875963 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 10:40:31 crc kubenswrapper[4748]: I0320 10:40:31.960094 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.033073 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.060123 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.130727 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.247520 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.281764 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.305522 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 10:40:32 
crc kubenswrapper[4748]: I0320 10:40:32.487807 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.504899 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.750683 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.914727 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.927896 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.941913 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.993041 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 10:40:32 crc kubenswrapper[4748]: I0320 10:40:32.995722 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.108275 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.150149 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.327421 4748 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.376572 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566720-mvtnf"] Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.444875 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.467031 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.480123 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.613700 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.779368 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.809919 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.860701 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.872574 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.886649 4748 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.900792 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566720-mvtnf"] Mar 20 10:40:33 crc kubenswrapper[4748]: I0320 10:40:33.972696 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.007363 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.044863 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.097866 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.119357 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.139914 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.155954 4748 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.185344 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.294685 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.335895 4748 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.382027 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.420423 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.427738 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.432817 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.480407 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.509119 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.712174 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.712174 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.754575 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.805235 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566720-mvtnf" 
event={"ID":"31d344ca-51fc-428e-9703-7d57a3c4bd8d","Type":"ContainerStarted","Data":"2f461cdc00113df8b2f8cc9f4e7a22bf8bfb0d6a53fa14eba2d707910ad557d3"} Mar 20 10:40:34 crc kubenswrapper[4748]: I0320 10:40:34.931864 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.045079 4748 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.045380 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e" gracePeriod=5 Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.106447 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.119828 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.164719 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.251932 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.304545 4748 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.320167 4748 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.364078 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.438757 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.447282 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.501284 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.580442 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.679036 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 10:40:35 crc kubenswrapper[4748]: I0320 10:40:35.895372 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.183087 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.184772 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.224662 4748 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.322260 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.342185 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.445465 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.514021 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.813785 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.874679 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.893161 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 10:40:36 crc kubenswrapper[4748]: I0320 10:40:36.896660 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 10:40:37 crc kubenswrapper[4748]: I0320 10:40:37.123822 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 10:40:37 crc kubenswrapper[4748]: I0320 10:40:37.222163 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 10:40:37 crc kubenswrapper[4748]: I0320 10:40:37.274267 4748 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 10:40:37 crc kubenswrapper[4748]: I0320 10:40:37.356952 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 10:40:37 crc kubenswrapper[4748]: I0320 10:40:37.532361 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 10:40:37 crc kubenswrapper[4748]: I0320 10:40:37.630300 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:40:37 crc kubenswrapper[4748]: I0320 10:40:37.827785 4748 generic.go:334] "Generic (PLEG): container finished" podID="31d344ca-51fc-428e-9703-7d57a3c4bd8d" containerID="ef739de92cc4bee9bb07f6445e2f48d6e4709f37b3f9ddb9a97a80ee16d85cfb" exitCode=0 Mar 20 10:40:37 crc kubenswrapper[4748]: I0320 10:40:37.827866 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566720-mvtnf" event={"ID":"31d344ca-51fc-428e-9703-7d57a3c4bd8d","Type":"ContainerDied","Data":"ef739de92cc4bee9bb07f6445e2f48d6e4709f37b3f9ddb9a97a80ee16d85cfb"} Mar 20 10:40:37 crc kubenswrapper[4748]: I0320 10:40:37.935557 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 10:40:38 crc kubenswrapper[4748]: I0320 10:40:38.311808 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 10:40:39 crc kubenswrapper[4748]: I0320 10:40:39.206297 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566720-mvtnf" Mar 20 10:40:39 crc kubenswrapper[4748]: I0320 10:40:39.282692 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr4lm\" (UniqueName: \"kubernetes.io/projected/31d344ca-51fc-428e-9703-7d57a3c4bd8d-kube-api-access-rr4lm\") pod \"31d344ca-51fc-428e-9703-7d57a3c4bd8d\" (UID: \"31d344ca-51fc-428e-9703-7d57a3c4bd8d\") " Mar 20 10:40:39 crc kubenswrapper[4748]: I0320 10:40:39.291231 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d344ca-51fc-428e-9703-7d57a3c4bd8d-kube-api-access-rr4lm" (OuterVolumeSpecName: "kube-api-access-rr4lm") pod "31d344ca-51fc-428e-9703-7d57a3c4bd8d" (UID: "31d344ca-51fc-428e-9703-7d57a3c4bd8d"). InnerVolumeSpecName "kube-api-access-rr4lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:40:39 crc kubenswrapper[4748]: I0320 10:40:39.384577 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr4lm\" (UniqueName: \"kubernetes.io/projected/31d344ca-51fc-428e-9703-7d57a3c4bd8d-kube-api-access-rr4lm\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:39 crc kubenswrapper[4748]: I0320 10:40:39.841520 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566720-mvtnf" Mar 20 10:40:39 crc kubenswrapper[4748]: I0320 10:40:39.841458 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566720-mvtnf" event={"ID":"31d344ca-51fc-428e-9703-7d57a3c4bd8d","Type":"ContainerDied","Data":"2f461cdc00113df8b2f8cc9f4e7a22bf8bfb0d6a53fa14eba2d707910ad557d3"} Mar 20 10:40:39 crc kubenswrapper[4748]: I0320 10:40:39.842377 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f461cdc00113df8b2f8cc9f4e7a22bf8bfb0d6a53fa14eba2d707910ad557d3" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.628022 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.628510 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.706082 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.706169 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.706228 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.706251 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.706320 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.706612 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.707365 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.707426 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.707451 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.716658 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.807948 4748 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.807991 4748 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.808005 4748 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.808014 4748 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:40 crc 
kubenswrapper[4748]: I0320 10:40:40.808022 4748 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.849696 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.849753 4748 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e" exitCode=137 Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.849901 4748 scope.go:117] "RemoveContainer" containerID="7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.849935 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.870993 4748 scope.go:117] "RemoveContainer" containerID="7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e" Mar 20 10:40:40 crc kubenswrapper[4748]: E0320 10:40:40.871620 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e\": container with ID starting with 7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e not found: ID does not exist" containerID="7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e" Mar 20 10:40:40 crc kubenswrapper[4748]: I0320 10:40:40.871657 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e"} err="failed to get container status \"7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e\": rpc error: code = NotFound desc = could not find container \"7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e\": container with ID starting with 7e2c3eae3778f34ab75fefa7f159b95ce1a719182c30da2a11ae081936d9860e not found: ID does not exist" Mar 20 10:40:41 crc kubenswrapper[4748]: I0320 10:40:41.523239 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.616920 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6j7x"] Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.617575 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b6j7x" podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" 
containerName="registry-server" containerID="cri-o://2576ae933efc7e5c139af3acae0dcde268a0fd6e25626147530d40547643435e" gracePeriod=30 Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.648496 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llsmr"] Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.648862 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-llsmr" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerName="registry-server" containerID="cri-o://818af1be7d986ef78e48ac5b436873844ea2ba14f05332d0bbd9e5da66c7b37b" gracePeriod=30 Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.654431 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xpncn"] Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.654903 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" podUID="9abcdd14-d386-4279-9d4c-4a7326a32a11" containerName="marketplace-operator" containerID="cri-o://dc05cb3effe7316098cdb06421a4067203e2d5538b4be516ead6a95e449ace37" gracePeriod=30 Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.656848 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kqb5"] Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.657184 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8kqb5" podUID="0186ffa9-907a-4afd-953d-28665f7343da" containerName="registry-server" containerID="cri-o://6b1e7e5524ba07cf6db412bc93cf5887af47b76b7b5baf3e610f543e27197475" gracePeriod=30 Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.659481 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z78kn"] Mar 20 10:40:42 crc 
kubenswrapper[4748]: I0320 10:40:42.659815 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z78kn" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerName="registry-server" containerID="cri-o://f8c8c3ae29c737432695cd88f17f33a39cc3e0b0f9828132bc08a1b2de163131" gracePeriod=30 Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.685591 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2qp4n"] Mar 20 10:40:42 crc kubenswrapper[4748]: E0320 10:40:42.685906 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.685921 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 10:40:42 crc kubenswrapper[4748]: E0320 10:40:42.685952 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d344ca-51fc-428e-9703-7d57a3c4bd8d" containerName="oc" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.685958 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d344ca-51fc-428e-9703-7d57a3c4bd8d" containerName="oc" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.686057 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.686072 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d344ca-51fc-428e-9703-7d57a3c4bd8d" containerName="oc" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.686544 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.717999 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2qp4n"] Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.841986 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e32ab73-56fa-4a44-bb26-42d87e8ee2d5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2qp4n\" (UID: \"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.842348 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpj55\" (UniqueName: \"kubernetes.io/projected/4e32ab73-56fa-4a44-bb26-42d87e8ee2d5-kube-api-access-hpj55\") pod \"marketplace-operator-79b997595-2qp4n\" (UID: \"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.842385 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4e32ab73-56fa-4a44-bb26-42d87e8ee2d5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2qp4n\" (UID: \"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.876477 4748 generic.go:334] "Generic (PLEG): container finished" podID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerID="f8c8c3ae29c737432695cd88f17f33a39cc3e0b0f9828132bc08a1b2de163131" exitCode=0 Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.876547 4748 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-z78kn" event={"ID":"7e393e84-d0bb-4258-8eef-012c9269fc05","Type":"ContainerDied","Data":"f8c8c3ae29c737432695cd88f17f33a39cc3e0b0f9828132bc08a1b2de163131"} Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.878520 4748 generic.go:334] "Generic (PLEG): container finished" podID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerID="818af1be7d986ef78e48ac5b436873844ea2ba14f05332d0bbd9e5da66c7b37b" exitCode=0 Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.878564 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsmr" event={"ID":"5196dc0c-2a46-4fb2-891b-682a6ce5eed9","Type":"ContainerDied","Data":"818af1be7d986ef78e48ac5b436873844ea2ba14f05332d0bbd9e5da66c7b37b"} Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.880373 4748 generic.go:334] "Generic (PLEG): container finished" podID="0186ffa9-907a-4afd-953d-28665f7343da" containerID="6b1e7e5524ba07cf6db412bc93cf5887af47b76b7b5baf3e610f543e27197475" exitCode=0 Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.880413 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kqb5" event={"ID":"0186ffa9-907a-4afd-953d-28665f7343da","Type":"ContainerDied","Data":"6b1e7e5524ba07cf6db412bc93cf5887af47b76b7b5baf3e610f543e27197475"} Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.882467 4748 generic.go:334] "Generic (PLEG): container finished" podID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" containerID="2576ae933efc7e5c139af3acae0dcde268a0fd6e25626147530d40547643435e" exitCode=0 Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.882549 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6j7x" event={"ID":"61390690-1bd1-43c9-b82b-e2c5fe3450f9","Type":"ContainerDied","Data":"2576ae933efc7e5c139af3acae0dcde268a0fd6e25626147530d40547643435e"} Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 
10:40:42.884076 4748 generic.go:334] "Generic (PLEG): container finished" podID="9abcdd14-d386-4279-9d4c-4a7326a32a11" containerID="dc05cb3effe7316098cdb06421a4067203e2d5538b4be516ead6a95e449ace37" exitCode=0 Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.884112 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" event={"ID":"9abcdd14-d386-4279-9d4c-4a7326a32a11","Type":"ContainerDied","Data":"dc05cb3effe7316098cdb06421a4067203e2d5538b4be516ead6a95e449ace37"} Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.928653 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.928715 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.944938 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e32ab73-56fa-4a44-bb26-42d87e8ee2d5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2qp4n\" (UID: \"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.945023 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpj55\" (UniqueName: \"kubernetes.io/projected/4e32ab73-56fa-4a44-bb26-42d87e8ee2d5-kube-api-access-hpj55\") pod 
\"marketplace-operator-79b997595-2qp4n\" (UID: \"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.945078 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4e32ab73-56fa-4a44-bb26-42d87e8ee2d5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2qp4n\" (UID: \"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.947246 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e32ab73-56fa-4a44-bb26-42d87e8ee2d5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2qp4n\" (UID: \"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.954027 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4e32ab73-56fa-4a44-bb26-42d87e8ee2d5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2qp4n\" (UID: \"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:42 crc kubenswrapper[4748]: I0320 10:40:42.984498 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpj55\" (UniqueName: \"kubernetes.io/projected/4e32ab73-56fa-4a44-bb26-42d87e8ee2d5-kube-api-access-hpj55\") pod \"marketplace-operator-79b997595-2qp4n\" (UID: \"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.078377 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.095364 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.144877 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.196781 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.206389 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.207899 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kqb5" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.250371 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-utilities\") pod \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.250486 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-utilities\") pod \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.250551 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-catalog-content\") pod \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.250663 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcb89\" (UniqueName: \"kubernetes.io/projected/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-kube-api-access-bcb89\") pod \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\" (UID: \"5196dc0c-2a46-4fb2-891b-682a6ce5eed9\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.250764 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-catalog-content\") pod \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.250896 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bsdsv\" (UniqueName: \"kubernetes.io/projected/61390690-1bd1-43c9-b82b-e2c5fe3450f9-kube-api-access-bsdsv\") pod \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\" (UID: \"61390690-1bd1-43c9-b82b-e2c5fe3450f9\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.257614 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-utilities" (OuterVolumeSpecName: "utilities") pod "5196dc0c-2a46-4fb2-891b-682a6ce5eed9" (UID: "5196dc0c-2a46-4fb2-891b-682a6ce5eed9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.258480 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-utilities" (OuterVolumeSpecName: "utilities") pod "61390690-1bd1-43c9-b82b-e2c5fe3450f9" (UID: "61390690-1bd1-43c9-b82b-e2c5fe3450f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.266187 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-kube-api-access-bcb89" (OuterVolumeSpecName: "kube-api-access-bcb89") pod "5196dc0c-2a46-4fb2-891b-682a6ce5eed9" (UID: "5196dc0c-2a46-4fb2-891b-682a6ce5eed9"). InnerVolumeSpecName "kube-api-access-bcb89". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.266280 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61390690-1bd1-43c9-b82b-e2c5fe3450f9-kube-api-access-bsdsv" (OuterVolumeSpecName: "kube-api-access-bsdsv") pod "61390690-1bd1-43c9-b82b-e2c5fe3450f9" (UID: "61390690-1bd1-43c9-b82b-e2c5fe3450f9"). InnerVolumeSpecName "kube-api-access-bsdsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.339666 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61390690-1bd1-43c9-b82b-e2c5fe3450f9" (UID: "61390690-1bd1-43c9-b82b-e2c5fe3450f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.352588 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-operator-metrics\") pod \"9abcdd14-d386-4279-9d4c-4a7326a32a11\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.352736 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnzq2\" (UniqueName: \"kubernetes.io/projected/0186ffa9-907a-4afd-953d-28665f7343da-kube-api-access-bnzq2\") pod \"0186ffa9-907a-4afd-953d-28665f7343da\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.352764 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng4rk\" (UniqueName: \"kubernetes.io/projected/9abcdd14-d386-4279-9d4c-4a7326a32a11-kube-api-access-ng4rk\") pod \"9abcdd14-d386-4279-9d4c-4a7326a32a11\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.352794 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-catalog-content\") pod \"0186ffa9-907a-4afd-953d-28665f7343da\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " Mar 20 10:40:43 crc 
kubenswrapper[4748]: I0320 10:40:43.352868 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-utilities\") pod \"7e393e84-d0bb-4258-8eef-012c9269fc05\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.352917 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-trusted-ca\") pod \"9abcdd14-d386-4279-9d4c-4a7326a32a11\" (UID: \"9abcdd14-d386-4279-9d4c-4a7326a32a11\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.352952 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs4lq\" (UniqueName: \"kubernetes.io/projected/7e393e84-d0bb-4258-8eef-012c9269fc05-kube-api-access-cs4lq\") pod \"7e393e84-d0bb-4258-8eef-012c9269fc05\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.353033 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-catalog-content\") pod \"7e393e84-d0bb-4258-8eef-012c9269fc05\" (UID: \"7e393e84-d0bb-4258-8eef-012c9269fc05\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.353056 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-utilities\") pod \"0186ffa9-907a-4afd-953d-28665f7343da\" (UID: \"0186ffa9-907a-4afd-953d-28665f7343da\") " Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.353297 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-catalog-content\") 
on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.353313 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsdsv\" (UniqueName: \"kubernetes.io/projected/61390690-1bd1-43c9-b82b-e2c5fe3450f9-kube-api-access-bsdsv\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.353328 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.353338 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61390690-1bd1-43c9-b82b-e2c5fe3450f9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.353347 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcb89\" (UniqueName: \"kubernetes.io/projected/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-kube-api-access-bcb89\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.354115 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-utilities" (OuterVolumeSpecName: "utilities") pod "7e393e84-d0bb-4258-8eef-012c9269fc05" (UID: "7e393e84-d0bb-4258-8eef-012c9269fc05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.354432 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "9abcdd14-d386-4279-9d4c-4a7326a32a11" (UID: "9abcdd14-d386-4279-9d4c-4a7326a32a11"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.354436 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-utilities" (OuterVolumeSpecName: "utilities") pod "0186ffa9-907a-4afd-953d-28665f7343da" (UID: "0186ffa9-907a-4afd-953d-28665f7343da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.358207 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "9abcdd14-d386-4279-9d4c-4a7326a32a11" (UID: "9abcdd14-d386-4279-9d4c-4a7326a32a11"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.358283 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e393e84-d0bb-4258-8eef-012c9269fc05-kube-api-access-cs4lq" (OuterVolumeSpecName: "kube-api-access-cs4lq") pod "7e393e84-d0bb-4258-8eef-012c9269fc05" (UID: "7e393e84-d0bb-4258-8eef-012c9269fc05"). InnerVolumeSpecName "kube-api-access-cs4lq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.361283 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5196dc0c-2a46-4fb2-891b-682a6ce5eed9" (UID: "5196dc0c-2a46-4fb2-891b-682a6ce5eed9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.364140 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0186ffa9-907a-4afd-953d-28665f7343da-kube-api-access-bnzq2" (OuterVolumeSpecName: "kube-api-access-bnzq2") pod "0186ffa9-907a-4afd-953d-28665f7343da" (UID: "0186ffa9-907a-4afd-953d-28665f7343da"). InnerVolumeSpecName "kube-api-access-bnzq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.374088 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9abcdd14-d386-4279-9d4c-4a7326a32a11-kube-api-access-ng4rk" (OuterVolumeSpecName: "kube-api-access-ng4rk") pod "9abcdd14-d386-4279-9d4c-4a7326a32a11" (UID: "9abcdd14-d386-4279-9d4c-4a7326a32a11"). InnerVolumeSpecName "kube-api-access-ng4rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.390950 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0186ffa9-907a-4afd-953d-28665f7343da" (UID: "0186ffa9-907a-4afd-953d-28665f7343da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.454643 4748 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.454686 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnzq2\" (UniqueName: \"kubernetes.io/projected/0186ffa9-907a-4afd-953d-28665f7343da-kube-api-access-bnzq2\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.454697 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng4rk\" (UniqueName: \"kubernetes.io/projected/9abcdd14-d386-4279-9d4c-4a7326a32a11-kube-api-access-ng4rk\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.454707 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.454717 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.454727 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5196dc0c-2a46-4fb2-891b-682a6ce5eed9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.454738 4748 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9abcdd14-d386-4279-9d4c-4a7326a32a11-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 
20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.454774 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs4lq\" (UniqueName: \"kubernetes.io/projected/7e393e84-d0bb-4258-8eef-012c9269fc05-kube-api-access-cs4lq\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.454783 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0186ffa9-907a-4afd-953d-28665f7343da-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.501905 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e393e84-d0bb-4258-8eef-012c9269fc05" (UID: "7e393e84-d0bb-4258-8eef-012c9269fc05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.584953 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e393e84-d0bb-4258-8eef-012c9269fc05-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.618070 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2qp4n"] Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.893731 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" event={"ID":"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5","Type":"ContainerStarted","Data":"b6b7b30a0273de88985b10154acdc64907841dce1fce596bef9b78a24b82fd70"} Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.893795 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" 
event={"ID":"4e32ab73-56fa-4a44-bb26-42d87e8ee2d5","Type":"ContainerStarted","Data":"91da16e2a9ef4a3a53cd7852348c1893603fd95c4fde03a7246430d1c029e1d8"} Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.894676 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.896347 4748 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2qp4n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.69:8080/healthz\": dial tcp 10.217.0.69:8080: connect: connection refused" start-of-body= Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.896466 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" podUID="4e32ab73-56fa-4a44-bb26-42d87e8ee2d5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.69:8080/healthz\": dial tcp 10.217.0.69:8080: connect: connection refused" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.896915 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6j7x" event={"ID":"61390690-1bd1-43c9-b82b-e2c5fe3450f9","Type":"ContainerDied","Data":"d56cd3eab7c073eb4a9b927e8f7f1421bab87030d220d6f6c3db782dfe765b2b"} Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.896957 4748 scope.go:117] "RemoveContainer" containerID="2576ae933efc7e5c139af3acae0dcde268a0fd6e25626147530d40547643435e" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.897011 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6j7x" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.913449 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z78kn" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.914163 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z78kn" event={"ID":"7e393e84-d0bb-4258-8eef-012c9269fc05","Type":"ContainerDied","Data":"3068537172e1764ef8b99022b1a72fa9f1251f541d313d66e9d4a5e0bc488d36"} Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.915250 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" podStartSLOduration=1.9152411310000002 podStartE2EDuration="1.915241131s" podCreationTimestamp="2026-03-20 10:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:40:43.912922714 +0000 UTC m=+279.054468528" watchObservedRunningTime="2026-03-20 10:40:43.915241131 +0000 UTC m=+279.056786945" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.915765 4748 scope.go:117] "RemoveContainer" containerID="37f2f4501a16a8351b691a83364c3a23446a050ebcdbf05e894a8d004c90cd9b" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.920533 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.920747 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xpncn" event={"ID":"9abcdd14-d386-4279-9d4c-4a7326a32a11","Type":"ContainerDied","Data":"3723617594ce7ac27a00b57c0746f02379306603b55346b8e86b2e1542c23d25"} Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.931660 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llsmr" event={"ID":"5196dc0c-2a46-4fb2-891b-682a6ce5eed9","Type":"ContainerDied","Data":"c5597c5769d09d8f641ee101ee959907fac4c4a7fcf56e80203ed0deb77ad325"} Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.931807 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llsmr" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.935170 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6j7x"] Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.938908 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b6j7x"] Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.939795 4748 scope.go:117] "RemoveContainer" containerID="c293e7296d050a2b3c7c25422c7d08c1b492af3270631d8b6de2f0928974382b" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.941210 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kqb5" event={"ID":"0186ffa9-907a-4afd-953d-28665f7343da","Type":"ContainerDied","Data":"6e814cddf9ef53a75c5c8eb21e64afcdd5f6e0ac31226ccf981a0ed4afc41628"} Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.941394 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kqb5" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.955011 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z78kn"] Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.961236 4748 scope.go:117] "RemoveContainer" containerID="f8c8c3ae29c737432695cd88f17f33a39cc3e0b0f9828132bc08a1b2de163131" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.963487 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z78kn"] Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.975942 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xpncn"] Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.981188 4748 scope.go:117] "RemoveContainer" containerID="6a514e8ee4323d9a111787c95d0dbf8b1c1ee49cb8a19f29ae2fe90c5525e67d" Mar 20 10:40:43 crc kubenswrapper[4748]: I0320 10:40:43.983227 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xpncn"] Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.000075 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llsmr"] Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.002136 4748 scope.go:117] "RemoveContainer" containerID="f656da61255b854e0469306f315d81a1fccfdca6dc84f440c4548beaf65bfa33" Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.005961 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-llsmr"] Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.010599 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kqb5"] Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.014985 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8kqb5"] Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.021237 4748 scope.go:117] "RemoveContainer" containerID="dc05cb3effe7316098cdb06421a4067203e2d5538b4be516ead6a95e449ace37" Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.046512 4748 scope.go:117] "RemoveContainer" containerID="818af1be7d986ef78e48ac5b436873844ea2ba14f05332d0bbd9e5da66c7b37b" Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.060478 4748 scope.go:117] "RemoveContainer" containerID="3f8eab936b75ab67d84bb367d0d05d8e2344ddee5c8d6abf180165b53d20fad3" Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.079806 4748 scope.go:117] "RemoveContainer" containerID="a9df4eadc7d988167690805eecffd70491944b494f875516f22dfcef0c5b96ac" Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.096423 4748 scope.go:117] "RemoveContainer" containerID="6b1e7e5524ba07cf6db412bc93cf5887af47b76b7b5baf3e610f543e27197475" Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.110153 4748 scope.go:117] "RemoveContainer" containerID="7fa50158e264bdda9d565f4dcd7ca54a039159bce144ead50f845245684d7d5e" Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.133014 4748 scope.go:117] "RemoveContainer" containerID="2bc8b876dfde81df471e95d41f7ca032bd935a8c65b856c94c61c8fe3a3de955" Mar 20 10:40:44 crc kubenswrapper[4748]: I0320 10:40:44.953901 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2qp4n" Mar 20 10:40:45 crc kubenswrapper[4748]: I0320 10:40:45.522638 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0186ffa9-907a-4afd-953d-28665f7343da" path="/var/lib/kubelet/pods/0186ffa9-907a-4afd-953d-28665f7343da/volumes" Mar 20 10:40:45 crc kubenswrapper[4748]: I0320 10:40:45.523331 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" 
path="/var/lib/kubelet/pods/5196dc0c-2a46-4fb2-891b-682a6ce5eed9/volumes" Mar 20 10:40:45 crc kubenswrapper[4748]: I0320 10:40:45.523923 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" path="/var/lib/kubelet/pods/61390690-1bd1-43c9-b82b-e2c5fe3450f9/volumes" Mar 20 10:40:45 crc kubenswrapper[4748]: I0320 10:40:45.525047 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" path="/var/lib/kubelet/pods/7e393e84-d0bb-4258-8eef-012c9269fc05/volumes" Mar 20 10:40:45 crc kubenswrapper[4748]: I0320 10:40:45.525670 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9abcdd14-d386-4279-9d4c-4a7326a32a11" path="/var/lib/kubelet/pods/9abcdd14-d386-4279-9d4c-4a7326a32a11/volumes" Mar 20 10:40:50 crc kubenswrapper[4748]: I0320 10:40:50.286757 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 10:40:51 crc kubenswrapper[4748]: I0320 10:40:51.984910 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 10:40:55 crc kubenswrapper[4748]: I0320 10:40:55.370649 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:40:55 crc kubenswrapper[4748]: I0320 10:40:55.424429 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 10:40:56 crc kubenswrapper[4748]: I0320 10:40:56.567612 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 10:40:56 crc kubenswrapper[4748]: I0320 10:40:56.957755 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 10:40:57 crc 
kubenswrapper[4748]: I0320 10:40:57.926114 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 10:40:58 crc kubenswrapper[4748]: I0320 10:40:58.087617 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 10:40:58 crc kubenswrapper[4748]: I0320 10:40:58.270862 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 10:40:59 crc kubenswrapper[4748]: I0320 10:40:59.596520 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:40:59 crc kubenswrapper[4748]: I0320 10:40:59.868471 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 10:41:00 crc kubenswrapper[4748]: I0320 10:41:00.941154 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 10:41:03 crc kubenswrapper[4748]: I0320 10:41:03.075865 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 10:41:03 crc kubenswrapper[4748]: I0320 10:41:03.748685 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.185232 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8mlkc"] Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186104 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" containerName="extract-utilities" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186126 4748 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" containerName="extract-utilities" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186149 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9abcdd14-d386-4279-9d4c-4a7326a32a11" containerName="marketplace-operator" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186159 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="9abcdd14-d386-4279-9d4c-4a7326a32a11" containerName="marketplace-operator" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186171 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0186ffa9-907a-4afd-953d-28665f7343da" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186181 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0186ffa9-907a-4afd-953d-28665f7343da" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186194 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerName="extract-content" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186204 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerName="extract-content" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186216 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186225 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186244 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0186ffa9-907a-4afd-953d-28665f7343da" containerName="extract-utilities" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186253 4748 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0186ffa9-907a-4afd-953d-28665f7343da" containerName="extract-utilities" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186264 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerName="extract-utilities" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186273 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerName="extract-utilities" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186284 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerName="extract-utilities" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186294 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerName="extract-utilities" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186304 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186314 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186323 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0186ffa9-907a-4afd-953d-28665f7343da" containerName="extract-content" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186332 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0186ffa9-907a-4afd-953d-28665f7343da" containerName="extract-content" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186345 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" containerName="extract-content" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186356 4748 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" containerName="extract-content" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186374 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186384 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: E0320 10:41:07.186402 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerName="extract-content" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186410 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerName="extract-content" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186538 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="61390690-1bd1-43c9-b82b-e2c5fe3450f9" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186554 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e393e84-d0bb-4258-8eef-012c9269fc05" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186569 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="0186ffa9-907a-4afd-953d-28665f7343da" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186583 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="9abcdd14-d386-4279-9d4c-4a7326a32a11" containerName="marketplace-operator" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.186594 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="5196dc0c-2a46-4fb2-891b-682a6ce5eed9" containerName="registry-server" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.187650 4748 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.190798 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.199688 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8mlkc"] Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.346710 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5qht\" (UniqueName: \"kubernetes.io/projected/36f779a6-7268-4911-8532-f6fda0d56533-kube-api-access-f5qht\") pod \"redhat-operators-8mlkc\" (UID: \"36f779a6-7268-4911-8532-f6fda0d56533\") " pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.346801 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f779a6-7268-4911-8532-f6fda0d56533-utilities\") pod \"redhat-operators-8mlkc\" (UID: \"36f779a6-7268-4911-8532-f6fda0d56533\") " pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.346873 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f779a6-7268-4911-8532-f6fda0d56533-catalog-content\") pod \"redhat-operators-8mlkc\" (UID: \"36f779a6-7268-4911-8532-f6fda0d56533\") " pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.370509 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xvhwr"] Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.371489 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.373726 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.382345 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvhwr"] Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.447883 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f779a6-7268-4911-8532-f6fda0d56533-utilities\") pod \"redhat-operators-8mlkc\" (UID: \"36f779a6-7268-4911-8532-f6fda0d56533\") " pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.448000 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f779a6-7268-4911-8532-f6fda0d56533-catalog-content\") pod \"redhat-operators-8mlkc\" (UID: \"36f779a6-7268-4911-8532-f6fda0d56533\") " pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.448055 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5qht\" (UniqueName: \"kubernetes.io/projected/36f779a6-7268-4911-8532-f6fda0d56533-kube-api-access-f5qht\") pod \"redhat-operators-8mlkc\" (UID: \"36f779a6-7268-4911-8532-f6fda0d56533\") " pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.448631 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f779a6-7268-4911-8532-f6fda0d56533-utilities\") pod \"redhat-operators-8mlkc\" (UID: \"36f779a6-7268-4911-8532-f6fda0d56533\") " pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 
10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.448789 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f779a6-7268-4911-8532-f6fda0d56533-catalog-content\") pod \"redhat-operators-8mlkc\" (UID: \"36f779a6-7268-4911-8532-f6fda0d56533\") " pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.476894 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5qht\" (UniqueName: \"kubernetes.io/projected/36f779a6-7268-4911-8532-f6fda0d56533-kube-api-access-f5qht\") pod \"redhat-operators-8mlkc\" (UID: \"36f779a6-7268-4911-8532-f6fda0d56533\") " pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.515245 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8mlkc" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.549442 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jjtx\" (UniqueName: \"kubernetes.io/projected/ae14d541-76df-4379-98ef-87f4e35e7db3-kube-api-access-8jjtx\") pod \"redhat-marketplace-xvhwr\" (UID: \"ae14d541-76df-4379-98ef-87f4e35e7db3\") " pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.549508 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae14d541-76df-4379-98ef-87f4e35e7db3-catalog-content\") pod \"redhat-marketplace-xvhwr\" (UID: \"ae14d541-76df-4379-98ef-87f4e35e7db3\") " pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.549528 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ae14d541-76df-4379-98ef-87f4e35e7db3-utilities\") pod \"redhat-marketplace-xvhwr\" (UID: \"ae14d541-76df-4379-98ef-87f4e35e7db3\") " pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.650816 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jjtx\" (UniqueName: \"kubernetes.io/projected/ae14d541-76df-4379-98ef-87f4e35e7db3-kube-api-access-8jjtx\") pod \"redhat-marketplace-xvhwr\" (UID: \"ae14d541-76df-4379-98ef-87f4e35e7db3\") " pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.650881 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae14d541-76df-4379-98ef-87f4e35e7db3-catalog-content\") pod \"redhat-marketplace-xvhwr\" (UID: \"ae14d541-76df-4379-98ef-87f4e35e7db3\") " pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.650898 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae14d541-76df-4379-98ef-87f4e35e7db3-utilities\") pod \"redhat-marketplace-xvhwr\" (UID: \"ae14d541-76df-4379-98ef-87f4e35e7db3\") " pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.651457 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae14d541-76df-4379-98ef-87f4e35e7db3-utilities\") pod \"redhat-marketplace-xvhwr\" (UID: \"ae14d541-76df-4379-98ef-87f4e35e7db3\") " pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.651463 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ae14d541-76df-4379-98ef-87f4e35e7db3-catalog-content\") pod \"redhat-marketplace-xvhwr\" (UID: \"ae14d541-76df-4379-98ef-87f4e35e7db3\") " pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.684785 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jjtx\" (UniqueName: \"kubernetes.io/projected/ae14d541-76df-4379-98ef-87f4e35e7db3-kube-api-access-8jjtx\") pod \"redhat-marketplace-xvhwr\" (UID: \"ae14d541-76df-4379-98ef-87f4e35e7db3\") " pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.690179 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvhwr" Mar 20 10:41:07 crc kubenswrapper[4748]: I0320 10:41:07.922718 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8mlkc"] Mar 20 10:41:08 crc kubenswrapper[4748]: I0320 10:41:08.095944 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvhwr"] Mar 20 10:41:08 crc kubenswrapper[4748]: I0320 10:41:08.108501 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mlkc" event={"ID":"36f779a6-7268-4911-8532-f6fda0d56533","Type":"ContainerStarted","Data":"bdf866199443347649db4269d7d081abb939b4bd371a9dc010b4e5220fc5e059"} Mar 20 10:41:08 crc kubenswrapper[4748]: W0320 10:41:08.118242 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae14d541_76df_4379_98ef_87f4e35e7db3.slice/crio-c955bdff5edbb87eb716123fc67f0bdbd381faffa8ef2951c3021387de28ad05 WatchSource:0}: Error finding container c955bdff5edbb87eb716123fc67f0bdbd381faffa8ef2951c3021387de28ad05: Status 404 returned error can't find the container with id 
c955bdff5edbb87eb716123fc67f0bdbd381faffa8ef2951c3021387de28ad05 Mar 20 10:41:08 crc kubenswrapper[4748]: I0320 10:41:08.389409 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.116275 4748 generic.go:334] "Generic (PLEG): container finished" podID="36f779a6-7268-4911-8532-f6fda0d56533" containerID="57ed6aa353b992b59ba7e4bb8430d7ba5322674dcdb4c91301c879e7dfab1cb9" exitCode=0 Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.116386 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mlkc" event={"ID":"36f779a6-7268-4911-8532-f6fda0d56533","Type":"ContainerDied","Data":"57ed6aa353b992b59ba7e4bb8430d7ba5322674dcdb4c91301c879e7dfab1cb9"} Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.121600 4748 generic.go:334] "Generic (PLEG): container finished" podID="ae14d541-76df-4379-98ef-87f4e35e7db3" containerID="a0a4dad363ad7e563a378747ccad3730c1414e87add3c5610238d621c0d2759d" exitCode=0 Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.121661 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvhwr" event={"ID":"ae14d541-76df-4379-98ef-87f4e35e7db3","Type":"ContainerDied","Data":"a0a4dad363ad7e563a378747ccad3730c1414e87add3c5610238d621c0d2759d"} Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.121711 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvhwr" event={"ID":"ae14d541-76df-4379-98ef-87f4e35e7db3","Type":"ContainerStarted","Data":"c955bdff5edbb87eb716123fc67f0bdbd381faffa8ef2951c3021387de28ad05"} Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.575859 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6z29z"] Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.580797 4748 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.582668 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.586197 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6z29z"] Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.682668 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-utilities\") pod \"community-operators-6z29z\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.682760 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt9qj\" (UniqueName: \"kubernetes.io/projected/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-kube-api-access-gt9qj\") pod \"community-operators-6z29z\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.682799 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-catalog-content\") pod \"community-operators-6z29z\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.779917 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-52b5d"] Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.781582 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52b5d" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.783990 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.783992 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt9qj\" (UniqueName: \"kubernetes.io/projected/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-kube-api-access-gt9qj\") pod \"community-operators-6z29z\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.784267 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-catalog-content\") pod \"community-operators-6z29z\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.784408 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-utilities\") pod \"community-operators-6z29z\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.785018 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-catalog-content\") pod \"community-operators-6z29z\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.786742 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-utilities\") pod \"community-operators-6z29z\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.796246 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52b5d"] Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.836946 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt9qj\" (UniqueName: \"kubernetes.io/projected/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-kube-api-access-gt9qj\") pod \"community-operators-6z29z\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.886279 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce0ed389-9ca5-4022-bc2d-3dfed380bd01-utilities\") pod \"certified-operators-52b5d\" (UID: \"ce0ed389-9ca5-4022-bc2d-3dfed380bd01\") " pod="openshift-marketplace/certified-operators-52b5d" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.886369 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce0ed389-9ca5-4022-bc2d-3dfed380bd01-catalog-content\") pod \"certified-operators-52b5d\" (UID: \"ce0ed389-9ca5-4022-bc2d-3dfed380bd01\") " pod="openshift-marketplace/certified-operators-52b5d" Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.886408 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7tc7\" (UniqueName: \"kubernetes.io/projected/ce0ed389-9ca5-4022-bc2d-3dfed380bd01-kube-api-access-r7tc7\") pod \"certified-operators-52b5d\" (UID: 
\"ce0ed389-9ca5-4022-bc2d-3dfed380bd01\") " pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.920856 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6z29z"
Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.987488 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce0ed389-9ca5-4022-bc2d-3dfed380bd01-utilities\") pod \"certified-operators-52b5d\" (UID: \"ce0ed389-9ca5-4022-bc2d-3dfed380bd01\") " pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.987555 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce0ed389-9ca5-4022-bc2d-3dfed380bd01-catalog-content\") pod \"certified-operators-52b5d\" (UID: \"ce0ed389-9ca5-4022-bc2d-3dfed380bd01\") " pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.987611 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7tc7\" (UniqueName: \"kubernetes.io/projected/ce0ed389-9ca5-4022-bc2d-3dfed380bd01-kube-api-access-r7tc7\") pod \"certified-operators-52b5d\" (UID: \"ce0ed389-9ca5-4022-bc2d-3dfed380bd01\") " pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.988132 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce0ed389-9ca5-4022-bc2d-3dfed380bd01-utilities\") pod \"certified-operators-52b5d\" (UID: \"ce0ed389-9ca5-4022-bc2d-3dfed380bd01\") " pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:09 crc kubenswrapper[4748]: I0320 10:41:09.988231 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce0ed389-9ca5-4022-bc2d-3dfed380bd01-catalog-content\") pod \"certified-operators-52b5d\" (UID: \"ce0ed389-9ca5-4022-bc2d-3dfed380bd01\") " pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:10 crc kubenswrapper[4748]: I0320 10:41:10.008467 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7tc7\" (UniqueName: \"kubernetes.io/projected/ce0ed389-9ca5-4022-bc2d-3dfed380bd01-kube-api-access-r7tc7\") pod \"certified-operators-52b5d\" (UID: \"ce0ed389-9ca5-4022-bc2d-3dfed380bd01\") " pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:10 crc kubenswrapper[4748]: I0320 10:41:10.107396 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:10 crc kubenswrapper[4748]: I0320 10:41:10.131870 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvhwr" event={"ID":"ae14d541-76df-4379-98ef-87f4e35e7db3","Type":"ContainerStarted","Data":"6c1ed16ea1aae3b73f15a5d677e86bedfaf1ebe7759a52e9244844bdddd29c2b"}
Mar 20 10:41:10 crc kubenswrapper[4748]: I0320 10:41:10.449586 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6z29z"]
Mar 20 10:41:10 crc kubenswrapper[4748]: W0320 10:41:10.455575 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e0b46d_a2cb_4927_9b1b_120bd2e07f5d.slice/crio-cbb259e56d8151d610054f57499bc743d81cbf714108b8c61a6fd35bbec9f8a3 WatchSource:0}: Error finding container cbb259e56d8151d610054f57499bc743d81cbf714108b8c61a6fd35bbec9f8a3: Status 404 returned error can't find the container with id cbb259e56d8151d610054f57499bc743d81cbf714108b8c61a6fd35bbec9f8a3
Mar 20 10:41:10 crc kubenswrapper[4748]: I0320 10:41:10.574729 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52b5d"]
Mar 20 10:41:10 crc kubenswrapper[4748]: W0320 10:41:10.582555 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0ed389_9ca5_4022_bc2d_3dfed380bd01.slice/crio-f57bf2fd83bc25b90d52ac918425563d7fdeebb08cf165e015fef81ea80be8a1 WatchSource:0}: Error finding container f57bf2fd83bc25b90d52ac918425563d7fdeebb08cf165e015fef81ea80be8a1: Status 404 returned error can't find the container with id f57bf2fd83bc25b90d52ac918425563d7fdeebb08cf165e015fef81ea80be8a1
Mar 20 10:41:10 crc kubenswrapper[4748]: I0320 10:41:10.739511 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 10:41:10 crc kubenswrapper[4748]: I0320 10:41:10.769012 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 20 10:41:11 crc kubenswrapper[4748]: I0320 10:41:11.140991 4748 generic.go:334] "Generic (PLEG): container finished" podID="ae14d541-76df-4379-98ef-87f4e35e7db3" containerID="6c1ed16ea1aae3b73f15a5d677e86bedfaf1ebe7759a52e9244844bdddd29c2b" exitCode=0
Mar 20 10:41:11 crc kubenswrapper[4748]: I0320 10:41:11.141141 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvhwr" event={"ID":"ae14d541-76df-4379-98ef-87f4e35e7db3","Type":"ContainerDied","Data":"6c1ed16ea1aae3b73f15a5d677e86bedfaf1ebe7759a52e9244844bdddd29c2b"}
Mar 20 10:41:11 crc kubenswrapper[4748]: I0320 10:41:11.142961 4748 generic.go:334] "Generic (PLEG): container finished" podID="ce0ed389-9ca5-4022-bc2d-3dfed380bd01" containerID="16362d7556d4258f635fba1997aa4dc6c9ff2544f90e2fc0d8372c538dd742dd" exitCode=0
Mar 20 10:41:11 crc kubenswrapper[4748]: I0320 10:41:11.143043 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52b5d"
event={"ID":"ce0ed389-9ca5-4022-bc2d-3dfed380bd01","Type":"ContainerDied","Data":"16362d7556d4258f635fba1997aa4dc6c9ff2544f90e2fc0d8372c538dd742dd"}
Mar 20 10:41:11 crc kubenswrapper[4748]: I0320 10:41:11.143082 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52b5d" event={"ID":"ce0ed389-9ca5-4022-bc2d-3dfed380bd01","Type":"ContainerStarted","Data":"f57bf2fd83bc25b90d52ac918425563d7fdeebb08cf165e015fef81ea80be8a1"}
Mar 20 10:41:11 crc kubenswrapper[4748]: I0320 10:41:11.153824 4748 generic.go:334] "Generic (PLEG): container finished" podID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerID="9177ba6cab2a6a6d71fde462aa7290c0b1d2b61323e4547f6a238c1997276536" exitCode=0
Mar 20 10:41:11 crc kubenswrapper[4748]: I0320 10:41:11.153955 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z29z" event={"ID":"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d","Type":"ContainerDied","Data":"9177ba6cab2a6a6d71fde462aa7290c0b1d2b61323e4547f6a238c1997276536"}
Mar 20 10:41:11 crc kubenswrapper[4748]: I0320 10:41:11.153992 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z29z" event={"ID":"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d","Type":"ContainerStarted","Data":"cbb259e56d8151d610054f57499bc743d81cbf714108b8c61a6fd35bbec9f8a3"}
Mar 20 10:41:11 crc kubenswrapper[4748]: I0320 10:41:11.158464 4748 generic.go:334] "Generic (PLEG): container finished" podID="36f779a6-7268-4911-8532-f6fda0d56533" containerID="8809519e93f5d02f4243781aa3941ec4d2a68b4637a733e8d07cad42eb1f250f" exitCode=0
Mar 20 10:41:11 crc kubenswrapper[4748]: I0320 10:41:11.158518 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mlkc" event={"ID":"36f779a6-7268-4911-8532-f6fda0d56533","Type":"ContainerDied","Data":"8809519e93f5d02f4243781aa3941ec4d2a68b4637a733e8d07cad42eb1f250f"}
Mar 20 10:41:12 crc kubenswrapper[4748]: I0320 10:41:12.169106 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mlkc" event={"ID":"36f779a6-7268-4911-8532-f6fda0d56533","Type":"ContainerStarted","Data":"08bd36e00d6c5e4c1e49d9cb9765e4158b352558e9a844f4c68e7ea0341d7041"}
Mar 20 10:41:12 crc kubenswrapper[4748]: I0320 10:41:12.175290 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvhwr" event={"ID":"ae14d541-76df-4379-98ef-87f4e35e7db3","Type":"ContainerStarted","Data":"7fba6d31b4792f7abc90bf7fd586a3c9c55b9e6c450131113de7a8fe1c81b4e6"}
Mar 20 10:41:12 crc kubenswrapper[4748]: I0320 10:41:12.176972 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52b5d" event={"ID":"ce0ed389-9ca5-4022-bc2d-3dfed380bd01","Type":"ContainerStarted","Data":"fb6eda069a5212bc7263d54bc437494dbb5570a83a3cd71631a8e493e6a2868f"}
Mar 20 10:41:12 crc kubenswrapper[4748]: I0320 10:41:12.188571 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8mlkc" podStartSLOduration=2.7184142749999998 podStartE2EDuration="5.188544505s" podCreationTimestamp="2026-03-20 10:41:07 +0000 UTC" firstStartedPulling="2026-03-20 10:41:09.120356868 +0000 UTC m=+304.261902692" lastFinishedPulling="2026-03-20 10:41:11.590487108 +0000 UTC m=+306.732032922" observedRunningTime="2026-03-20 10:41:12.185257234 +0000 UTC m=+307.326803058" watchObservedRunningTime="2026-03-20 10:41:12.188544505 +0000 UTC m=+307.330090319"
Mar 20 10:41:12 crc kubenswrapper[4748]: I0320 10:41:12.216617 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xvhwr" podStartSLOduration=2.794314747 podStartE2EDuration="5.216589451s" podCreationTimestamp="2026-03-20 10:41:07 +0000 UTC" firstStartedPulling="2026-03-20 10:41:09.125363472 +0000 UTC m=+304.266909286" lastFinishedPulling="2026-03-20 10:41:11.547638176 +0000 UTC m=+306.689183990" observedRunningTime="2026-03-20 10:41:12.215388201 +0000 UTC m=+307.356934035" watchObservedRunningTime="2026-03-20 10:41:12.216589451 +0000 UTC m=+307.358135265"
Mar 20 10:41:12 crc kubenswrapper[4748]: I0320 10:41:12.928687 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 10:41:12 crc kubenswrapper[4748]: I0320 10:41:12.928768 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 10:41:12 crc kubenswrapper[4748]: I0320 10:41:12.928852 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz"
Mar 20 10:41:12 crc kubenswrapper[4748]: I0320 10:41:12.929739 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"03c4104b260930c777d385e243f9163dff5c4db4d6a523b77574c5b0ba63b705"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 10:41:12 crc kubenswrapper[4748]: I0320 10:41:12.929918 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://03c4104b260930c777d385e243f9163dff5c4db4d6a523b77574c5b0ba63b705" gracePeriod=600
Mar 20 10:41:13 crc
kubenswrapper[4748]: I0320 10:41:13.184124 4748 generic.go:334] "Generic (PLEG): container finished" podID="ce0ed389-9ca5-4022-bc2d-3dfed380bd01" containerID="fb6eda069a5212bc7263d54bc437494dbb5570a83a3cd71631a8e493e6a2868f" exitCode=0
Mar 20 10:41:13 crc kubenswrapper[4748]: I0320 10:41:13.184220 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52b5d" event={"ID":"ce0ed389-9ca5-4022-bc2d-3dfed380bd01","Type":"ContainerDied","Data":"fb6eda069a5212bc7263d54bc437494dbb5570a83a3cd71631a8e493e6a2868f"}
Mar 20 10:41:13 crc kubenswrapper[4748]: I0320 10:41:13.188804 4748 generic.go:334] "Generic (PLEG): container finished" podID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerID="0252a36180031a6eed46041ba045fb4b4447018f24b3826a05a8d9513c52d198" exitCode=0
Mar 20 10:41:13 crc kubenswrapper[4748]: I0320 10:41:13.188946 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z29z" event={"ID":"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d","Type":"ContainerDied","Data":"0252a36180031a6eed46041ba045fb4b4447018f24b3826a05a8d9513c52d198"}
Mar 20 10:41:13 crc kubenswrapper[4748]: I0320 10:41:13.195256 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="03c4104b260930c777d385e243f9163dff5c4db4d6a523b77574c5b0ba63b705" exitCode=0
Mar 20 10:41:13 crc kubenswrapper[4748]: I0320 10:41:13.195349 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"03c4104b260930c777d385e243f9163dff5c4db4d6a523b77574c5b0ba63b705"}
Mar 20 10:41:14 crc kubenswrapper[4748]: I0320 10:41:14.205584 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z29z" event={"ID":"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d","Type":"ContainerStarted","Data":"750d87d8ef0f290024f77c3d1b17a9fa28a645c91eff50a38838d86cc4b74c8e"}
Mar 20 10:41:14 crc kubenswrapper[4748]: I0320 10:41:14.208438 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"3d722000c65d389fb1b79a8ffb41c8ee2832c576f06ef2d0738841c23d445dbc"}
Mar 20 10:41:14 crc kubenswrapper[4748]: I0320 10:41:14.210723 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52b5d" event={"ID":"ce0ed389-9ca5-4022-bc2d-3dfed380bd01","Type":"ContainerStarted","Data":"340e144014a6a293a1e27bdd3c22b3b2b7340cae7bfb146129c6c922cb2233ec"}
Mar 20 10:41:14 crc kubenswrapper[4748]: I0320 10:41:14.242588 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6z29z" podStartSLOduration=2.647370128 podStartE2EDuration="5.242558678s" podCreationTimestamp="2026-03-20 10:41:09 +0000 UTC" firstStartedPulling="2026-03-20 10:41:11.156350775 +0000 UTC m=+306.297896589" lastFinishedPulling="2026-03-20 10:41:13.751539325 +0000 UTC m=+308.893085139" observedRunningTime="2026-03-20 10:41:14.235888603 +0000 UTC m=+309.377434437" watchObservedRunningTime="2026-03-20 10:41:14.242558678 +0000 UTC m=+309.384104522"
Mar 20 10:41:14 crc kubenswrapper[4748]: I0320 10:41:14.298065 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-52b5d" podStartSLOduration=2.871158666 podStartE2EDuration="5.298039034s" podCreationTimestamp="2026-03-20 10:41:09 +0000 UTC" firstStartedPulling="2026-03-20 10:41:11.145365882 +0000 UTC m=+306.286911696" lastFinishedPulling="2026-03-20 10:41:13.57224625 +0000 UTC m=+308.713792064" observedRunningTime="2026-03-20 10:41:14.295033519 +0000 UTC m=+309.436579353" watchObservedRunningTime="2026-03-20 10:41:14.298039034 +0000 UTC m=+309.439584848"
Mar 20 10:41:15 crc kubenswrapper[4748]: I0320 10:41:15.339947 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 10:41:17 crc kubenswrapper[4748]: I0320 10:41:17.523979 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8mlkc"
Mar 20 10:41:17 crc kubenswrapper[4748]: I0320 10:41:17.524912 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8mlkc"
Mar 20 10:41:17 crc kubenswrapper[4748]: I0320 10:41:17.691037 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xvhwr"
Mar 20 10:41:17 crc kubenswrapper[4748]: I0320 10:41:17.691634 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xvhwr"
Mar 20 10:41:17 crc kubenswrapper[4748]: I0320 10:41:17.741314 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xvhwr"
Mar 20 10:41:18 crc kubenswrapper[4748]: I0320 10:41:18.279986 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xvhwr"
Mar 20 10:41:18 crc kubenswrapper[4748]: I0320 10:41:18.575236 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8mlkc" podUID="36f779a6-7268-4911-8532-f6fda0d56533" containerName="registry-server" probeResult="failure" output=<
Mar 20 10:41:18 crc kubenswrapper[4748]: timeout: failed to connect service ":50051" within 1s
Mar 20 10:41:18 crc kubenswrapper[4748]: >
Mar 20 10:41:19 crc kubenswrapper[4748]: I0320 10:41:19.922358 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy"
pod="openshift-marketplace/community-operators-6z29z"
Mar 20 10:41:19 crc kubenswrapper[4748]: I0320 10:41:19.922873 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6z29z"
Mar 20 10:41:19 crc kubenswrapper[4748]: I0320 10:41:19.968685 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6z29z"
Mar 20 10:41:20 crc kubenswrapper[4748]: I0320 10:41:20.109370 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:20 crc kubenswrapper[4748]: I0320 10:41:20.109431 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:20 crc kubenswrapper[4748]: I0320 10:41:20.151887 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:20 crc kubenswrapper[4748]: I0320 10:41:20.301090 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6z29z"
Mar 20 10:41:20 crc kubenswrapper[4748]: I0320 10:41:20.308137 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-52b5d"
Mar 20 10:41:27 crc kubenswrapper[4748]: I0320 10:41:27.563761 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8mlkc"
Mar 20 10:41:27 crc kubenswrapper[4748]: I0320 10:41:27.609021 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8mlkc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.589270 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qcjcc"]
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.591012 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.603028 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qcjcc"]
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.723550 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/829907f7-4d41-4f50-b5f2-51d9c3c6a254-registry-certificates\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.724010 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/829907f7-4d41-4f50-b5f2-51d9c3c6a254-bound-sa-token\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.724045 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/829907f7-4d41-4f50-b5f2-51d9c3c6a254-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.724080 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwwbs\" (UniqueName: \"kubernetes.io/projected/829907f7-4d41-4f50-b5f2-51d9c3c6a254-kube-api-access-lwwbs\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.724103 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/829907f7-4d41-4f50-b5f2-51d9c3c6a254-trusted-ca\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.724129 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/829907f7-4d41-4f50-b5f2-51d9c3c6a254-registry-tls\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.724185 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.724209 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/829907f7-4d41-4f50-b5f2-51d9c3c6a254-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.757549 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.825351 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/829907f7-4d41-4f50-b5f2-51d9c3c6a254-registry-certificates\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.825450 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/829907f7-4d41-4f50-b5f2-51d9c3c6a254-bound-sa-token\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.825486 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/829907f7-4d41-4f50-b5f2-51d9c3c6a254-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.825543 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwwbs\" (UniqueName: \"kubernetes.io/projected/829907f7-4d41-4f50-b5f2-51d9c3c6a254-kube-api-access-lwwbs\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.825569 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/829907f7-4d41-4f50-b5f2-51d9c3c6a254-trusted-ca\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.825613 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/829907f7-4d41-4f50-b5f2-51d9c3c6a254-registry-tls\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.825659 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/829907f7-4d41-4f50-b5f2-51d9c3c6a254-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.826298 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/829907f7-4d41-4f50-b5f2-51d9c3c6a254-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.827452 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/829907f7-4d41-4f50-b5f2-51d9c3c6a254-registry-certificates\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.827530 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/829907f7-4d41-4f50-b5f2-51d9c3c6a254-trusted-ca\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.838567 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/829907f7-4d41-4f50-b5f2-51d9c3c6a254-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.838750 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/829907f7-4d41-4f50-b5f2-51d9c3c6a254-registry-tls\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.857803 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/829907f7-4d41-4f50-b5f2-51d9c3c6a254-bound-sa-token\") pod \"image-registry-66df7c8f76-qcjcc\" (UID: \"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.858139 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwwbs\" (UniqueName: \"kubernetes.io/projected/829907f7-4d41-4f50-b5f2-51d9c3c6a254-kube-api-access-lwwbs\") pod \"image-registry-66df7c8f76-qcjcc\" (UID:
\"829907f7-4d41-4f50-b5f2-51d9c3c6a254\") " pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:41:59 crc kubenswrapper[4748]: I0320 10:41:59.910271 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.145241 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566722-lznxj"]
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.147604 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566722-lznxj"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.152199 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.152294 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.152556 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.162082 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566722-lznxj"]
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.228129 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qcjcc"]
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.235888 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f96dr\" (UniqueName: \"kubernetes.io/projected/7a83ac80-507f-4bb4-875c-72191dfc5d37-kube-api-access-f96dr\") pod \"auto-csr-approver-29566722-lznxj\" (UID: \"7a83ac80-507f-4bb4-875c-72191dfc5d37\") " pod="openshift-infra/auto-csr-approver-29566722-lznxj"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.337661 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f96dr\" (UniqueName: \"kubernetes.io/projected/7a83ac80-507f-4bb4-875c-72191dfc5d37-kube-api-access-f96dr\") pod \"auto-csr-approver-29566722-lznxj\" (UID: \"7a83ac80-507f-4bb4-875c-72191dfc5d37\") " pod="openshift-infra/auto-csr-approver-29566722-lznxj"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.357230 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f96dr\" (UniqueName: \"kubernetes.io/projected/7a83ac80-507f-4bb4-875c-72191dfc5d37-kube-api-access-f96dr\") pod \"auto-csr-approver-29566722-lznxj\" (UID: \"7a83ac80-507f-4bb4-875c-72191dfc5d37\") " pod="openshift-infra/auto-csr-approver-29566722-lznxj"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.491943 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc" event={"ID":"829907f7-4d41-4f50-b5f2-51d9c3c6a254","Type":"ContainerStarted","Data":"d12ef9c5e13af9f38d1b6a36ceb0d64ce4388843803dae8254f5599440983d1d"}
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.492373 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc" event={"ID":"829907f7-4d41-4f50-b5f2-51d9c3c6a254","Type":"ContainerStarted","Data":"2d42bb7fd30034473aec8161e24d70cdd4d08d7d8b8778a28defafaaaf789f83"}
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.493464 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.497514 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566722-lznxj"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.520881 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc" podStartSLOduration=1.520852064 podStartE2EDuration="1.520852064s" podCreationTimestamp="2026-03-20 10:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:42:00.517172628 +0000 UTC m=+355.658718452" watchObservedRunningTime="2026-03-20 10:42:00.520852064 +0000 UTC m=+355.662397888"
Mar 20 10:42:00 crc kubenswrapper[4748]: I0320 10:42:00.674095 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566722-lznxj"]
Mar 20 10:42:01 crc kubenswrapper[4748]: I0320 10:42:01.503068 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566722-lznxj" event={"ID":"7a83ac80-507f-4bb4-875c-72191dfc5d37","Type":"ContainerStarted","Data":"aa662548a74bfe64254cc33cc953200a9ceb886a830ce9dcb528a2d4daae807e"}
Mar 20 10:42:02 crc kubenswrapper[4748]: I0320 10:42:02.522684 4748 generic.go:334] "Generic (PLEG): container finished" podID="7a83ac80-507f-4bb4-875c-72191dfc5d37" containerID="05f6d172d8cbaa53d4393427574a4285c4c262cb2a0d2c352deda90a14cd7a44" exitCode=0
Mar 20 10:42:02 crc kubenswrapper[4748]: I0320 10:42:02.522774 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566722-lznxj" event={"ID":"7a83ac80-507f-4bb4-875c-72191dfc5d37","Type":"ContainerDied","Data":"05f6d172d8cbaa53d4393427574a4285c4c262cb2a0d2c352deda90a14cd7a44"}
Mar 20 10:42:03 crc kubenswrapper[4748]: I0320 10:42:03.843702 4748 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566722-lznxj"
Mar 20 10:42:03 crc kubenswrapper[4748]: I0320 10:42:03.995599 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f96dr\" (UniqueName: \"kubernetes.io/projected/7a83ac80-507f-4bb4-875c-72191dfc5d37-kube-api-access-f96dr\") pod \"7a83ac80-507f-4bb4-875c-72191dfc5d37\" (UID: \"7a83ac80-507f-4bb4-875c-72191dfc5d37\") "
Mar 20 10:42:04 crc kubenswrapper[4748]: I0320 10:42:04.004455 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a83ac80-507f-4bb4-875c-72191dfc5d37-kube-api-access-f96dr" (OuterVolumeSpecName: "kube-api-access-f96dr") pod "7a83ac80-507f-4bb4-875c-72191dfc5d37" (UID: "7a83ac80-507f-4bb4-875c-72191dfc5d37"). InnerVolumeSpecName "kube-api-access-f96dr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:42:04 crc kubenswrapper[4748]: I0320 10:42:04.097909 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f96dr\" (UniqueName: \"kubernetes.io/projected/7a83ac80-507f-4bb4-875c-72191dfc5d37-kube-api-access-f96dr\") on node \"crc\" DevicePath \"\""
Mar 20 10:42:04 crc kubenswrapper[4748]: I0320 10:42:04.542428 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566722-lznxj" event={"ID":"7a83ac80-507f-4bb4-875c-72191dfc5d37","Type":"ContainerDied","Data":"aa662548a74bfe64254cc33cc953200a9ceb886a830ce9dcb528a2d4daae807e"}
Mar 20 10:42:04 crc kubenswrapper[4748]: I0320 10:42:04.542495 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa662548a74bfe64254cc33cc953200a9ceb886a830ce9dcb528a2d4daae807e"
Mar 20 10:42:04 crc kubenswrapper[4748]: I0320 10:42:04.542552 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566722-lznxj"
Mar 20 10:42:19 crc kubenswrapper[4748]: I0320 10:42:19.917801 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-qcjcc"
Mar 20 10:42:20 crc kubenswrapper[4748]: I0320 10:42:20.037814 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tlkp2"]
Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.092095 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" podUID="7da7288f-ca47-4172-a3dd-80a79e803277" containerName="registry" containerID="cri-o://ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090" gracePeriod=30
Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.532358 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2"
Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.704298 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-registry-tls\") pod \"7da7288f-ca47-4172-a3dd-80a79e803277\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") "
Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.704633 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7da7288f-ca47-4172-a3dd-80a79e803277\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") "
Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.704675 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-bound-sa-token\") pod \"7da7288f-ca47-4172-a3dd-80a79e803277\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") "
Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.704714 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7da7288f-ca47-4172-a3dd-80a79e803277-ca-trust-extracted\") pod \"7da7288f-ca47-4172-a3dd-80a79e803277\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") "
Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.704765 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7da7288f-ca47-4172-a3dd-80a79e803277-installation-pull-secrets\") pod \"7da7288f-ca47-4172-a3dd-80a79e803277\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") "
Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.704818 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-trusted-ca\") pod \"7da7288f-ca47-4172-a3dd-80a79e803277\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") "
Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.704887 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bqkp\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-kube-api-access-7bqkp\") pod \"7da7288f-ca47-4172-a3dd-80a79e803277\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") "
Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.704924 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-registry-certificates\") pod \"7da7288f-ca47-4172-a3dd-80a79e803277\" (UID: \"7da7288f-ca47-4172-a3dd-80a79e803277\") "
Mar 20
10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.706031 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7da7288f-ca47-4172-a3dd-80a79e803277" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.706095 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7da7288f-ca47-4172-a3dd-80a79e803277" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.715576 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7da7288f-ca47-4172-a3dd-80a79e803277" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.716008 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da7288f-ca47-4172-a3dd-80a79e803277-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7da7288f-ca47-4172-a3dd-80a79e803277" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.716341 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7da7288f-ca47-4172-a3dd-80a79e803277" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.719076 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-kube-api-access-7bqkp" (OuterVolumeSpecName: "kube-api-access-7bqkp") pod "7da7288f-ca47-4172-a3dd-80a79e803277" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277"). InnerVolumeSpecName "kube-api-access-7bqkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.719705 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7da7288f-ca47-4172-a3dd-80a79e803277" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.733245 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da7288f-ca47-4172-a3dd-80a79e803277-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7da7288f-ca47-4172-a3dd-80a79e803277" (UID: "7da7288f-ca47-4172-a3dd-80a79e803277"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.806689 4748 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.806725 4748 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7da7288f-ca47-4172-a3dd-80a79e803277-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.806741 4748 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7da7288f-ca47-4172-a3dd-80a79e803277-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.806755 4748 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.806768 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bqkp\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-kube-api-access-7bqkp\") on node \"crc\" DevicePath \"\"" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.806780 4748 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7da7288f-ca47-4172-a3dd-80a79e803277-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 10:42:45 crc kubenswrapper[4748]: I0320 10:42:45.806791 4748 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7da7288f-ca47-4172-a3dd-80a79e803277-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:42:46 crc 
kubenswrapper[4748]: I0320 10:42:46.165073 4748 generic.go:334] "Generic (PLEG): container finished" podID="7da7288f-ca47-4172-a3dd-80a79e803277" containerID="ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090" exitCode=0 Mar 20 10:42:46 crc kubenswrapper[4748]: I0320 10:42:46.165138 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" event={"ID":"7da7288f-ca47-4172-a3dd-80a79e803277","Type":"ContainerDied","Data":"ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090"} Mar 20 10:42:46 crc kubenswrapper[4748]: I0320 10:42:46.165178 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" Mar 20 10:42:46 crc kubenswrapper[4748]: I0320 10:42:46.165199 4748 scope.go:117] "RemoveContainer" containerID="ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090" Mar 20 10:42:46 crc kubenswrapper[4748]: I0320 10:42:46.165182 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tlkp2" event={"ID":"7da7288f-ca47-4172-a3dd-80a79e803277","Type":"ContainerDied","Data":"543bdf75f48d9514a0a08fe229aab3dc1bcc97e95fcbe956aea8900aef92ec9b"} Mar 20 10:42:46 crc kubenswrapper[4748]: I0320 10:42:46.202043 4748 scope.go:117] "RemoveContainer" containerID="ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090" Mar 20 10:42:46 crc kubenswrapper[4748]: E0320 10:42:46.203776 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090\": container with ID starting with ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090 not found: ID does not exist" containerID="ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090" Mar 20 10:42:46 crc kubenswrapper[4748]: I0320 10:42:46.204108 4748 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090"} err="failed to get container status \"ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090\": rpc error: code = NotFound desc = could not find container \"ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090\": container with ID starting with ec12c12a7b334c9ad69d297bd9c80324cece598002a189fe9536a4aaf650a090 not found: ID does not exist" Mar 20 10:42:46 crc kubenswrapper[4748]: I0320 10:42:46.230152 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tlkp2"] Mar 20 10:42:46 crc kubenswrapper[4748]: I0320 10:42:46.235311 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tlkp2"] Mar 20 10:42:47 crc kubenswrapper[4748]: I0320 10:42:47.528975 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7da7288f-ca47-4172-a3dd-80a79e803277" path="/var/lib/kubelet/pods/7da7288f-ca47-4172-a3dd-80a79e803277/volumes" Mar 20 10:43:42 crc kubenswrapper[4748]: I0320 10:43:42.928227 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:43:42 crc kubenswrapper[4748]: I0320 10:43:42.928787 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.153855 4748 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29566724-dh74f"] Mar 20 10:44:00 crc kubenswrapper[4748]: E0320 10:44:00.154806 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da7288f-ca47-4172-a3dd-80a79e803277" containerName="registry" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.154827 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da7288f-ca47-4172-a3dd-80a79e803277" containerName="registry" Mar 20 10:44:00 crc kubenswrapper[4748]: E0320 10:44:00.154877 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a83ac80-507f-4bb4-875c-72191dfc5d37" containerName="oc" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.154889 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a83ac80-507f-4bb4-875c-72191dfc5d37" containerName="oc" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.155060 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a83ac80-507f-4bb4-875c-72191dfc5d37" containerName="oc" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.155089 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da7288f-ca47-4172-a3dd-80a79e803277" containerName="registry" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.155732 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566724-dh74f" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.158388 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.159567 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.160009 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.160051 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566724-dh74f"] Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.197742 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jln84\" (UniqueName: \"kubernetes.io/projected/d228a510-6e8c-4489-9d87-5d0cdec16828-kube-api-access-jln84\") pod \"auto-csr-approver-29566724-dh74f\" (UID: \"d228a510-6e8c-4489-9d87-5d0cdec16828\") " pod="openshift-infra/auto-csr-approver-29566724-dh74f" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.299688 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jln84\" (UniqueName: \"kubernetes.io/projected/d228a510-6e8c-4489-9d87-5d0cdec16828-kube-api-access-jln84\") pod \"auto-csr-approver-29566724-dh74f\" (UID: \"d228a510-6e8c-4489-9d87-5d0cdec16828\") " pod="openshift-infra/auto-csr-approver-29566724-dh74f" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.327620 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jln84\" (UniqueName: \"kubernetes.io/projected/d228a510-6e8c-4489-9d87-5d0cdec16828-kube-api-access-jln84\") pod \"auto-csr-approver-29566724-dh74f\" (UID: \"d228a510-6e8c-4489-9d87-5d0cdec16828\") " 
pod="openshift-infra/auto-csr-approver-29566724-dh74f" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.476253 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566724-dh74f" Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.907538 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566724-dh74f"] Mar 20 10:44:00 crc kubenswrapper[4748]: I0320 10:44:00.916791 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:44:01 crc kubenswrapper[4748]: I0320 10:44:01.687645 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566724-dh74f" event={"ID":"d228a510-6e8c-4489-9d87-5d0cdec16828","Type":"ContainerStarted","Data":"ccccdd568eba1274fe79d039f076099c6fbe37ef1394b4b98fc8d18e6cf9c161"} Mar 20 10:44:04 crc kubenswrapper[4748]: I0320 10:44:04.708218 4748 generic.go:334] "Generic (PLEG): container finished" podID="d228a510-6e8c-4489-9d87-5d0cdec16828" containerID="e180064a952221cbb687ca77ba1ef3e2b706db532ef183b13386b8491ef01057" exitCode=0 Mar 20 10:44:04 crc kubenswrapper[4748]: I0320 10:44:04.708275 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566724-dh74f" event={"ID":"d228a510-6e8c-4489-9d87-5d0cdec16828","Type":"ContainerDied","Data":"e180064a952221cbb687ca77ba1ef3e2b706db532ef183b13386b8491ef01057"} Mar 20 10:44:05 crc kubenswrapper[4748]: I0320 10:44:05.988046 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566724-dh74f" Mar 20 10:44:06 crc kubenswrapper[4748]: I0320 10:44:06.075810 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jln84\" (UniqueName: \"kubernetes.io/projected/d228a510-6e8c-4489-9d87-5d0cdec16828-kube-api-access-jln84\") pod \"d228a510-6e8c-4489-9d87-5d0cdec16828\" (UID: \"d228a510-6e8c-4489-9d87-5d0cdec16828\") " Mar 20 10:44:06 crc kubenswrapper[4748]: I0320 10:44:06.080624 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d228a510-6e8c-4489-9d87-5d0cdec16828-kube-api-access-jln84" (OuterVolumeSpecName: "kube-api-access-jln84") pod "d228a510-6e8c-4489-9d87-5d0cdec16828" (UID: "d228a510-6e8c-4489-9d87-5d0cdec16828"). InnerVolumeSpecName "kube-api-access-jln84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:44:06 crc kubenswrapper[4748]: I0320 10:44:06.176736 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jln84\" (UniqueName: \"kubernetes.io/projected/d228a510-6e8c-4489-9d87-5d0cdec16828-kube-api-access-jln84\") on node \"crc\" DevicePath \"\"" Mar 20 10:44:06 crc kubenswrapper[4748]: I0320 10:44:06.725167 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566724-dh74f" event={"ID":"d228a510-6e8c-4489-9d87-5d0cdec16828","Type":"ContainerDied","Data":"ccccdd568eba1274fe79d039f076099c6fbe37ef1394b4b98fc8d18e6cf9c161"} Mar 20 10:44:06 crc kubenswrapper[4748]: I0320 10:44:06.725222 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccccdd568eba1274fe79d039f076099c6fbe37ef1394b4b98fc8d18e6cf9c161" Mar 20 10:44:06 crc kubenswrapper[4748]: I0320 10:44:06.725228 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566724-dh74f" Mar 20 10:44:07 crc kubenswrapper[4748]: I0320 10:44:07.051889 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566718-7hmv9"] Mar 20 10:44:07 crc kubenswrapper[4748]: I0320 10:44:07.059823 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566718-7hmv9"] Mar 20 10:44:07 crc kubenswrapper[4748]: I0320 10:44:07.522284 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0de9aa72-edab-4ae9-b2dd-e20ef6b83277" path="/var/lib/kubelet/pods/0de9aa72-edab-4ae9-b2dd-e20ef6b83277/volumes" Mar 20 10:44:12 crc kubenswrapper[4748]: I0320 10:44:12.929065 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:44:12 crc kubenswrapper[4748]: I0320 10:44:12.929435 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:44:42 crc kubenswrapper[4748]: I0320 10:44:42.928929 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:44:42 crc kubenswrapper[4748]: I0320 10:44:42.929623 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" 
podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:44:42 crc kubenswrapper[4748]: I0320 10:44:42.929703 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:44:42 crc kubenswrapper[4748]: I0320 10:44:42.930917 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d722000c65d389fb1b79a8ffb41c8ee2832c576f06ef2d0738841c23d445dbc"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:44:42 crc kubenswrapper[4748]: I0320 10:44:42.931061 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://3d722000c65d389fb1b79a8ffb41c8ee2832c576f06ef2d0738841c23d445dbc" gracePeriod=600 Mar 20 10:44:43 crc kubenswrapper[4748]: I0320 10:44:43.965914 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="3d722000c65d389fb1b79a8ffb41c8ee2832c576f06ef2d0738841c23d445dbc" exitCode=0 Mar 20 10:44:43 crc kubenswrapper[4748]: I0320 10:44:43.965952 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"3d722000c65d389fb1b79a8ffb41c8ee2832c576f06ef2d0738841c23d445dbc"} Mar 20 10:44:43 crc kubenswrapper[4748]: I0320 10:44:43.967559 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"261e51a50d63c2c75ffa3ccd3542b6a7465ac5ce52a44fe320cc511b3ec023ef"} Mar 20 10:44:43 crc kubenswrapper[4748]: I0320 10:44:43.967652 4748 scope.go:117] "RemoveContainer" containerID="03c4104b260930c777d385e243f9163dff5c4db4d6a523b77574c5b0ba63b705" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.156980 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7"] Mar 20 10:45:00 crc kubenswrapper[4748]: E0320 10:45:00.157772 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d228a510-6e8c-4489-9d87-5d0cdec16828" containerName="oc" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.157788 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d228a510-6e8c-4489-9d87-5d0cdec16828" containerName="oc" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.157946 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="d228a510-6e8c-4489-9d87-5d0cdec16828" containerName="oc" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.158462 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.162451 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.162612 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.165778 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7"] Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.335301 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8j2q\" (UniqueName: \"kubernetes.io/projected/ead5bfd4-b9a5-4edf-b020-e088057446c8-kube-api-access-l8j2q\") pod \"collect-profiles-29566725-gpxq7\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.335360 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead5bfd4-b9a5-4edf-b020-e088057446c8-config-volume\") pod \"collect-profiles-29566725-gpxq7\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.335562 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ead5bfd4-b9a5-4edf-b020-e088057446c8-secret-volume\") pod \"collect-profiles-29566725-gpxq7\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.437406 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead5bfd4-b9a5-4edf-b020-e088057446c8-config-volume\") pod \"collect-profiles-29566725-gpxq7\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.437468 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8j2q\" (UniqueName: \"kubernetes.io/projected/ead5bfd4-b9a5-4edf-b020-e088057446c8-kube-api-access-l8j2q\") pod \"collect-profiles-29566725-gpxq7\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.437517 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ead5bfd4-b9a5-4edf-b020-e088057446c8-secret-volume\") pod \"collect-profiles-29566725-gpxq7\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.439329 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead5bfd4-b9a5-4edf-b020-e088057446c8-config-volume\") pod \"collect-profiles-29566725-gpxq7\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.444023 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ead5bfd4-b9a5-4edf-b020-e088057446c8-secret-volume\") pod \"collect-profiles-29566725-gpxq7\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.454606 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8j2q\" (UniqueName: \"kubernetes.io/projected/ead5bfd4-b9a5-4edf-b020-e088057446c8-kube-api-access-l8j2q\") pod \"collect-profiles-29566725-gpxq7\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.518934 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:00 crc kubenswrapper[4748]: I0320 10:45:00.903384 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7"] Mar 20 10:45:01 crc kubenswrapper[4748]: I0320 10:45:01.089115 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" event={"ID":"ead5bfd4-b9a5-4edf-b020-e088057446c8","Type":"ContainerStarted","Data":"5bd32b9d23d9890b6a0cb2ba9e1c35f3220e35a9f55eb4065b0e177debfa0db2"} Mar 20 10:45:01 crc kubenswrapper[4748]: I0320 10:45:01.089174 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" event={"ID":"ead5bfd4-b9a5-4edf-b020-e088057446c8","Type":"ContainerStarted","Data":"8309283c259c355a1fc6cb217721536a29e6ba99712fb2a1fc2db51568f233eb"} Mar 20 10:45:01 crc kubenswrapper[4748]: I0320 10:45:01.111630 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" 
podStartSLOduration=1.111607429 podStartE2EDuration="1.111607429s" podCreationTimestamp="2026-03-20 10:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:45:01.106540245 +0000 UTC m=+536.248086079" watchObservedRunningTime="2026-03-20 10:45:01.111607429 +0000 UTC m=+536.253153253" Mar 20 10:45:02 crc kubenswrapper[4748]: I0320 10:45:02.099421 4748 generic.go:334] "Generic (PLEG): container finished" podID="ead5bfd4-b9a5-4edf-b020-e088057446c8" containerID="5bd32b9d23d9890b6a0cb2ba9e1c35f3220e35a9f55eb4065b0e177debfa0db2" exitCode=0 Mar 20 10:45:02 crc kubenswrapper[4748]: I0320 10:45:02.099491 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" event={"ID":"ead5bfd4-b9a5-4edf-b020-e088057446c8","Type":"ContainerDied","Data":"5bd32b9d23d9890b6a0cb2ba9e1c35f3220e35a9f55eb4065b0e177debfa0db2"} Mar 20 10:45:03 crc kubenswrapper[4748]: I0320 10:45:03.428623 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:03 crc kubenswrapper[4748]: I0320 10:45:03.576105 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead5bfd4-b9a5-4edf-b020-e088057446c8-config-volume\") pod \"ead5bfd4-b9a5-4edf-b020-e088057446c8\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " Mar 20 10:45:03 crc kubenswrapper[4748]: I0320 10:45:03.576207 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ead5bfd4-b9a5-4edf-b020-e088057446c8-secret-volume\") pod \"ead5bfd4-b9a5-4edf-b020-e088057446c8\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " Mar 20 10:45:03 crc kubenswrapper[4748]: I0320 10:45:03.576259 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8j2q\" (UniqueName: \"kubernetes.io/projected/ead5bfd4-b9a5-4edf-b020-e088057446c8-kube-api-access-l8j2q\") pod \"ead5bfd4-b9a5-4edf-b020-e088057446c8\" (UID: \"ead5bfd4-b9a5-4edf-b020-e088057446c8\") " Mar 20 10:45:03 crc kubenswrapper[4748]: I0320 10:45:03.577356 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ead5bfd4-b9a5-4edf-b020-e088057446c8-config-volume" (OuterVolumeSpecName: "config-volume") pod "ead5bfd4-b9a5-4edf-b020-e088057446c8" (UID: "ead5bfd4-b9a5-4edf-b020-e088057446c8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:45:03 crc kubenswrapper[4748]: I0320 10:45:03.582994 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead5bfd4-b9a5-4edf-b020-e088057446c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ead5bfd4-b9a5-4edf-b020-e088057446c8" (UID: "ead5bfd4-b9a5-4edf-b020-e088057446c8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:45:03 crc kubenswrapper[4748]: I0320 10:45:03.583817 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead5bfd4-b9a5-4edf-b020-e088057446c8-kube-api-access-l8j2q" (OuterVolumeSpecName: "kube-api-access-l8j2q") pod "ead5bfd4-b9a5-4edf-b020-e088057446c8" (UID: "ead5bfd4-b9a5-4edf-b020-e088057446c8"). InnerVolumeSpecName "kube-api-access-l8j2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:45:03 crc kubenswrapper[4748]: I0320 10:45:03.677937 4748 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead5bfd4-b9a5-4edf-b020-e088057446c8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:45:03 crc kubenswrapper[4748]: I0320 10:45:03.678014 4748 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ead5bfd4-b9a5-4edf-b020-e088057446c8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:45:03 crc kubenswrapper[4748]: I0320 10:45:03.678032 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8j2q\" (UniqueName: \"kubernetes.io/projected/ead5bfd4-b9a5-4edf-b020-e088057446c8-kube-api-access-l8j2q\") on node \"crc\" DevicePath \"\"" Mar 20 10:45:04 crc kubenswrapper[4748]: I0320 10:45:04.114586 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" event={"ID":"ead5bfd4-b9a5-4edf-b020-e088057446c8","Type":"ContainerDied","Data":"8309283c259c355a1fc6cb217721536a29e6ba99712fb2a1fc2db51568f233eb"} Mar 20 10:45:04 crc kubenswrapper[4748]: I0320 10:45:04.114631 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8309283c259c355a1fc6cb217721536a29e6ba99712fb2a1fc2db51568f233eb" Mar 20 10:45:04 crc kubenswrapper[4748]: I0320 10:45:04.114702 4748 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7" Mar 20 10:45:10 crc kubenswrapper[4748]: I0320 10:45:10.371206 4748 scope.go:117] "RemoveContainer" containerID="067458dd3e7c2dede621fcaf9ba3479682b15253b8c33280d96f1fbc0595b0bf" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.158813 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566726-ts528"] Mar 20 10:46:00 crc kubenswrapper[4748]: E0320 10:46:00.160268 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead5bfd4-b9a5-4edf-b020-e088057446c8" containerName="collect-profiles" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.160298 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead5bfd4-b9a5-4edf-b020-e088057446c8" containerName="collect-profiles" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.160555 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="ead5bfd4-b9a5-4edf-b020-e088057446c8" containerName="collect-profiles" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.161370 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566726-ts528" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.164809 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.165475 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.168374 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566726-ts528"] Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.173347 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.291358 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngsj6\" (UniqueName: \"kubernetes.io/projected/375ada94-2843-44ed-8f85-0be2255a00de-kube-api-access-ngsj6\") pod \"auto-csr-approver-29566726-ts528\" (UID: \"375ada94-2843-44ed-8f85-0be2255a00de\") " pod="openshift-infra/auto-csr-approver-29566726-ts528" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.393170 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngsj6\" (UniqueName: \"kubernetes.io/projected/375ada94-2843-44ed-8f85-0be2255a00de-kube-api-access-ngsj6\") pod \"auto-csr-approver-29566726-ts528\" (UID: \"375ada94-2843-44ed-8f85-0be2255a00de\") " pod="openshift-infra/auto-csr-approver-29566726-ts528" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.428243 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngsj6\" (UniqueName: \"kubernetes.io/projected/375ada94-2843-44ed-8f85-0be2255a00de-kube-api-access-ngsj6\") pod \"auto-csr-approver-29566726-ts528\" (UID: \"375ada94-2843-44ed-8f85-0be2255a00de\") " 
pod="openshift-infra/auto-csr-approver-29566726-ts528" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.487061 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566726-ts528" Mar 20 10:46:00 crc kubenswrapper[4748]: I0320 10:46:00.773021 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566726-ts528"] Mar 20 10:46:01 crc kubenswrapper[4748]: I0320 10:46:01.529604 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566726-ts528" event={"ID":"375ada94-2843-44ed-8f85-0be2255a00de","Type":"ContainerStarted","Data":"b006546e01bfd1b2f6dfd45f0d5b9a7903112d921b7379ab3d9340f9564f3c4a"} Mar 20 10:46:04 crc kubenswrapper[4748]: I0320 10:46:04.547995 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566726-ts528" event={"ID":"375ada94-2843-44ed-8f85-0be2255a00de","Type":"ContainerStarted","Data":"034dd2badb36035539c1599476c14607f6a703ff3b80fce629e2ab422f884e3a"} Mar 20 10:46:04 crc kubenswrapper[4748]: I0320 10:46:04.567454 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566726-ts528" podStartSLOduration=1.111254035 podStartE2EDuration="4.567433067s" podCreationTimestamp="2026-03-20 10:46:00 +0000 UTC" firstStartedPulling="2026-03-20 10:46:00.78513025 +0000 UTC m=+595.926676104" lastFinishedPulling="2026-03-20 10:46:04.241309332 +0000 UTC m=+599.382855136" observedRunningTime="2026-03-20 10:46:04.56447828 +0000 UTC m=+599.706024124" watchObservedRunningTime="2026-03-20 10:46:04.567433067 +0000 UTC m=+599.708978891" Mar 20 10:46:05 crc kubenswrapper[4748]: I0320 10:46:05.556137 4748 generic.go:334] "Generic (PLEG): container finished" podID="375ada94-2843-44ed-8f85-0be2255a00de" containerID="034dd2badb36035539c1599476c14607f6a703ff3b80fce629e2ab422f884e3a" exitCode=0 Mar 20 10:46:05 crc kubenswrapper[4748]: 
I0320 10:46:05.556281 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566726-ts528" event={"ID":"375ada94-2843-44ed-8f85-0be2255a00de","Type":"ContainerDied","Data":"034dd2badb36035539c1599476c14607f6a703ff3b80fce629e2ab422f884e3a"} Mar 20 10:46:06 crc kubenswrapper[4748]: I0320 10:46:06.828269 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566726-ts528" Mar 20 10:46:06 crc kubenswrapper[4748]: I0320 10:46:06.978603 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngsj6\" (UniqueName: \"kubernetes.io/projected/375ada94-2843-44ed-8f85-0be2255a00de-kube-api-access-ngsj6\") pod \"375ada94-2843-44ed-8f85-0be2255a00de\" (UID: \"375ada94-2843-44ed-8f85-0be2255a00de\") " Mar 20 10:46:06 crc kubenswrapper[4748]: I0320 10:46:06.984813 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375ada94-2843-44ed-8f85-0be2255a00de-kube-api-access-ngsj6" (OuterVolumeSpecName: "kube-api-access-ngsj6") pod "375ada94-2843-44ed-8f85-0be2255a00de" (UID: "375ada94-2843-44ed-8f85-0be2255a00de"). InnerVolumeSpecName "kube-api-access-ngsj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:46:07 crc kubenswrapper[4748]: I0320 10:46:07.081287 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngsj6\" (UniqueName: \"kubernetes.io/projected/375ada94-2843-44ed-8f85-0be2255a00de-kube-api-access-ngsj6\") on node \"crc\" DevicePath \"\"" Mar 20 10:46:07 crc kubenswrapper[4748]: I0320 10:46:07.575604 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566726-ts528" event={"ID":"375ada94-2843-44ed-8f85-0be2255a00de","Type":"ContainerDied","Data":"b006546e01bfd1b2f6dfd45f0d5b9a7903112d921b7379ab3d9340f9564f3c4a"} Mar 20 10:46:07 crc kubenswrapper[4748]: I0320 10:46:07.575660 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b006546e01bfd1b2f6dfd45f0d5b9a7903112d921b7379ab3d9340f9564f3c4a" Mar 20 10:46:07 crc kubenswrapper[4748]: I0320 10:46:07.575693 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566726-ts528" Mar 20 10:46:07 crc kubenswrapper[4748]: I0320 10:46:07.631390 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566720-mvtnf"] Mar 20 10:46:07 crc kubenswrapper[4748]: I0320 10:46:07.635149 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566720-mvtnf"] Mar 20 10:46:09 crc kubenswrapper[4748]: I0320 10:46:09.536627 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d344ca-51fc-428e-9703-7d57a3c4bd8d" path="/var/lib/kubelet/pods/31d344ca-51fc-428e-9703-7d57a3c4bd8d/volumes" Mar 20 10:46:10 crc kubenswrapper[4748]: I0320 10:46:10.427181 4748 scope.go:117] "RemoveContainer" containerID="7d3e72e3fd6fca7d1a1240ac3c55ba176e943063f21e150b1a4cea49f6cce92e" Mar 20 10:47:10 crc kubenswrapper[4748]: I0320 10:47:10.490223 4748 scope.go:117] "RemoveContainer" 
containerID="ef739de92cc4bee9bb07f6445e2f48d6e4709f37b3f9ddb9a97a80ee16d85cfb" Mar 20 10:47:12 crc kubenswrapper[4748]: I0320 10:47:12.928318 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:47:12 crc kubenswrapper[4748]: I0320 10:47:12.928421 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:47:42 crc kubenswrapper[4748]: I0320 10:47:42.928668 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:47:42 crc kubenswrapper[4748]: I0320 10:47:42.929663 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.262732 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-krcxd"] Mar 20 10:47:59 crc kubenswrapper[4748]: E0320 10:47:59.264334 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375ada94-2843-44ed-8f85-0be2255a00de" containerName="oc" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.264425 4748 
state_mem.go:107] "Deleted CPUSet assignment" podUID="375ada94-2843-44ed-8f85-0be2255a00de" containerName="oc" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.264619 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="375ada94-2843-44ed-8f85-0be2255a00de" containerName="oc" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.265158 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-krcxd" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.267391 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.267823 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc"] Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.267998 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.268574 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.269672 4748 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-tqvgv" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.271073 4748 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dbknx" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.274648 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-jt8rf"] Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.275801 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-jt8rf" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.277164 4748 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-rwghc" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.283995 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc"] Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.284351 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnm9d\" (UniqueName: \"kubernetes.io/projected/bd82c1fc-6aff-4336-80fd-247fbc7aed58-kube-api-access-lnm9d\") pod \"cert-manager-cainjector-cf98fcc89-wn2nc\" (UID: \"bd82c1fc-6aff-4336-80fd-247fbc7aed58\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.284648 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw755\" (UniqueName: \"kubernetes.io/projected/668082d7-988d-415d-bfde-1c28171130b5-kube-api-access-tw755\") pod \"cert-manager-858654f9db-krcxd\" (UID: \"668082d7-988d-415d-bfde-1c28171130b5\") " pod="cert-manager/cert-manager-858654f9db-krcxd" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.288977 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-krcxd"] Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.290769 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-jt8rf"] Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.385748 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jktd\" (UniqueName: \"kubernetes.io/projected/2f029dc3-bb1a-4b18-91ee-fd467cbe157f-kube-api-access-5jktd\") pod \"cert-manager-webhook-687f57d79b-jt8rf\" 
(UID: \"2f029dc3-bb1a-4b18-91ee-fd467cbe157f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-jt8rf" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.386301 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnm9d\" (UniqueName: \"kubernetes.io/projected/bd82c1fc-6aff-4336-80fd-247fbc7aed58-kube-api-access-lnm9d\") pod \"cert-manager-cainjector-cf98fcc89-wn2nc\" (UID: \"bd82c1fc-6aff-4336-80fd-247fbc7aed58\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.386497 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw755\" (UniqueName: \"kubernetes.io/projected/668082d7-988d-415d-bfde-1c28171130b5-kube-api-access-tw755\") pod \"cert-manager-858654f9db-krcxd\" (UID: \"668082d7-988d-415d-bfde-1c28171130b5\") " pod="cert-manager/cert-manager-858654f9db-krcxd" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.409382 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnm9d\" (UniqueName: \"kubernetes.io/projected/bd82c1fc-6aff-4336-80fd-247fbc7aed58-kube-api-access-lnm9d\") pod \"cert-manager-cainjector-cf98fcc89-wn2nc\" (UID: \"bd82c1fc-6aff-4336-80fd-247fbc7aed58\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.410758 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw755\" (UniqueName: \"kubernetes.io/projected/668082d7-988d-415d-bfde-1c28171130b5-kube-api-access-tw755\") pod \"cert-manager-858654f9db-krcxd\" (UID: \"668082d7-988d-415d-bfde-1c28171130b5\") " pod="cert-manager/cert-manager-858654f9db-krcxd" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.488184 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jktd\" (UniqueName: 
\"kubernetes.io/projected/2f029dc3-bb1a-4b18-91ee-fd467cbe157f-kube-api-access-5jktd\") pod \"cert-manager-webhook-687f57d79b-jt8rf\" (UID: \"2f029dc3-bb1a-4b18-91ee-fd467cbe157f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-jt8rf" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.518952 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jktd\" (UniqueName: \"kubernetes.io/projected/2f029dc3-bb1a-4b18-91ee-fd467cbe157f-kube-api-access-5jktd\") pod \"cert-manager-webhook-687f57d79b-jt8rf\" (UID: \"2f029dc3-bb1a-4b18-91ee-fd467cbe157f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-jt8rf" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.590260 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-krcxd" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.599860 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.613998 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-jt8rf" Mar 20 10:47:59 crc kubenswrapper[4748]: I0320 10:47:59.848471 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-krcxd"] Mar 20 10:47:59 crc kubenswrapper[4748]: W0320 10:47:59.859386 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod668082d7_988d_415d_bfde_1c28171130b5.slice/crio-817c00b308829ae968d4653730eabc5cfb15811416a9d69ae4c5efc0ee2eb97a WatchSource:0}: Error finding container 817c00b308829ae968d4653730eabc5cfb15811416a9d69ae4c5efc0ee2eb97a: Status 404 returned error can't find the container with id 817c00b308829ae968d4653730eabc5cfb15811416a9d69ae4c5efc0ee2eb97a Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.122988 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-jt8rf"] Mar 20 10:48:00 crc kubenswrapper[4748]: W0320 10:48:00.132736 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f029dc3_bb1a_4b18_91ee_fd467cbe157f.slice/crio-0a5a6b3e0d67e8dc62203cd15c20af0b7213dd9739935da105bf882555035a2d WatchSource:0}: Error finding container 0a5a6b3e0d67e8dc62203cd15c20af0b7213dd9739935da105bf882555035a2d: Status 404 returned error can't find the container with id 0a5a6b3e0d67e8dc62203cd15c20af0b7213dd9739935da105bf882555035a2d Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.135122 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566728-z252s"] Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.135957 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566728-z252s" Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.146052 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc"] Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.146081 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.146219 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.148124 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.149743 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566728-z252s"] Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.197127 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxpk4\" (UniqueName: \"kubernetes.io/projected/f04b7943-417c-40b2-bc8f-a8a7ed916a47-kube-api-access-cxpk4\") pod \"auto-csr-approver-29566728-z252s\" (UID: \"f04b7943-417c-40b2-bc8f-a8a7ed916a47\") " pod="openshift-infra/auto-csr-approver-29566728-z252s" Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.298239 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxpk4\" (UniqueName: \"kubernetes.io/projected/f04b7943-417c-40b2-bc8f-a8a7ed916a47-kube-api-access-cxpk4\") pod \"auto-csr-approver-29566728-z252s\" (UID: \"f04b7943-417c-40b2-bc8f-a8a7ed916a47\") " pod="openshift-infra/auto-csr-approver-29566728-z252s" Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.317671 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxpk4\" (UniqueName: 
\"kubernetes.io/projected/f04b7943-417c-40b2-bc8f-a8a7ed916a47-kube-api-access-cxpk4\") pod \"auto-csr-approver-29566728-z252s\" (UID: \"f04b7943-417c-40b2-bc8f-a8a7ed916a47\") " pod="openshift-infra/auto-csr-approver-29566728-z252s" Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.467657 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566728-z252s" Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.595095 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc" event={"ID":"bd82c1fc-6aff-4336-80fd-247fbc7aed58","Type":"ContainerStarted","Data":"7b89c5edda82e74f888ab6a824c983b6858d1e4d1f6ddd68267f8e657a40789b"} Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.595673 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-jt8rf" event={"ID":"2f029dc3-bb1a-4b18-91ee-fd467cbe157f","Type":"ContainerStarted","Data":"0a5a6b3e0d67e8dc62203cd15c20af0b7213dd9739935da105bf882555035a2d"} Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.596420 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-krcxd" event={"ID":"668082d7-988d-415d-bfde-1c28171130b5","Type":"ContainerStarted","Data":"817c00b308829ae968d4653730eabc5cfb15811416a9d69ae4c5efc0ee2eb97a"} Mar 20 10:48:00 crc kubenswrapper[4748]: I0320 10:48:00.679373 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566728-z252s"] Mar 20 10:48:01 crc kubenswrapper[4748]: I0320 10:48:01.603061 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566728-z252s" event={"ID":"f04b7943-417c-40b2-bc8f-a8a7ed916a47","Type":"ContainerStarted","Data":"cbfc5b84adb285e89a571116dff301907aedf400d0c662903dac2f8455ed3a0a"} Mar 20 10:48:04 crc kubenswrapper[4748]: I0320 10:48:04.625519 4748 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-krcxd" event={"ID":"668082d7-988d-415d-bfde-1c28171130b5","Type":"ContainerStarted","Data":"daed16f95059ea0c45303bc107b34c187e11352c95457ec3d3c85ffe06b145cf"} Mar 20 10:48:04 crc kubenswrapper[4748]: I0320 10:48:04.627961 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc" event={"ID":"bd82c1fc-6aff-4336-80fd-247fbc7aed58","Type":"ContainerStarted","Data":"164d1a1639bb1a878875ff1e7cb500210ac14d75965a236df55d55b1f81d3ba2"} Mar 20 10:48:04 crc kubenswrapper[4748]: I0320 10:48:04.630686 4748 generic.go:334] "Generic (PLEG): container finished" podID="f04b7943-417c-40b2-bc8f-a8a7ed916a47" containerID="b089bfcde2b05951629c210ad4c13c707d17e323e4706365a2d8ab37cb31bfb4" exitCode=0 Mar 20 10:48:04 crc kubenswrapper[4748]: I0320 10:48:04.630745 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566728-z252s" event={"ID":"f04b7943-417c-40b2-bc8f-a8a7ed916a47","Type":"ContainerDied","Data":"b089bfcde2b05951629c210ad4c13c707d17e323e4706365a2d8ab37cb31bfb4"} Mar 20 10:48:04 crc kubenswrapper[4748]: I0320 10:48:04.633199 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-jt8rf" event={"ID":"2f029dc3-bb1a-4b18-91ee-fd467cbe157f","Type":"ContainerStarted","Data":"d005fc01b8a78af60166646b713025474f50fc999d4715f10235f3ee75ef4003"} Mar 20 10:48:04 crc kubenswrapper[4748]: I0320 10:48:04.633334 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-jt8rf" Mar 20 10:48:04 crc kubenswrapper[4748]: I0320 10:48:04.647418 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-krcxd" podStartSLOduration=1.8151137849999999 podStartE2EDuration="5.647395566s" podCreationTimestamp="2026-03-20 10:47:59 +0000 UTC" firstStartedPulling="2026-03-20 
10:47:59.868996573 +0000 UTC m=+715.010542387" lastFinishedPulling="2026-03-20 10:48:03.701278354 +0000 UTC m=+718.842824168" observedRunningTime="2026-03-20 10:48:04.645480208 +0000 UTC m=+719.787026062" watchObservedRunningTime="2026-03-20 10:48:04.647395566 +0000 UTC m=+719.788941390" Mar 20 10:48:04 crc kubenswrapper[4748]: I0320 10:48:04.680141 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-jt8rf" podStartSLOduration=2.088050008 podStartE2EDuration="5.680118877s" podCreationTimestamp="2026-03-20 10:47:59 +0000 UTC" firstStartedPulling="2026-03-20 10:48:00.154247015 +0000 UTC m=+715.295792829" lastFinishedPulling="2026-03-20 10:48:03.746315854 +0000 UTC m=+718.887861698" observedRunningTime="2026-03-20 10:48:04.679523012 +0000 UTC m=+719.821068846" watchObservedRunningTime="2026-03-20 10:48:04.680118877 +0000 UTC m=+719.821664701" Mar 20 10:48:04 crc kubenswrapper[4748]: I0320 10:48:04.713484 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wn2nc" podStartSLOduration=2.22836782 podStartE2EDuration="5.713464114s" podCreationTimestamp="2026-03-20 10:47:59 +0000 UTC" firstStartedPulling="2026-03-20 10:48:00.155884776 +0000 UTC m=+715.297430630" lastFinishedPulling="2026-03-20 10:48:03.64098111 +0000 UTC m=+718.782526924" observedRunningTime="2026-03-20 10:48:04.712980732 +0000 UTC m=+719.854526536" watchObservedRunningTime="2026-03-20 10:48:04.713464114 +0000 UTC m=+719.855009928" Mar 20 10:48:05 crc kubenswrapper[4748]: I0320 10:48:05.953269 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566728-z252s" Mar 20 10:48:05 crc kubenswrapper[4748]: I0320 10:48:05.980705 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxpk4\" (UniqueName: \"kubernetes.io/projected/f04b7943-417c-40b2-bc8f-a8a7ed916a47-kube-api-access-cxpk4\") pod \"f04b7943-417c-40b2-bc8f-a8a7ed916a47\" (UID: \"f04b7943-417c-40b2-bc8f-a8a7ed916a47\") " Mar 20 10:48:05 crc kubenswrapper[4748]: I0320 10:48:05.992472 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04b7943-417c-40b2-bc8f-a8a7ed916a47-kube-api-access-cxpk4" (OuterVolumeSpecName: "kube-api-access-cxpk4") pod "f04b7943-417c-40b2-bc8f-a8a7ed916a47" (UID: "f04b7943-417c-40b2-bc8f-a8a7ed916a47"). InnerVolumeSpecName "kube-api-access-cxpk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:48:06 crc kubenswrapper[4748]: I0320 10:48:06.084098 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxpk4\" (UniqueName: \"kubernetes.io/projected/f04b7943-417c-40b2-bc8f-a8a7ed916a47-kube-api-access-cxpk4\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:06 crc kubenswrapper[4748]: I0320 10:48:06.653925 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566728-z252s" event={"ID":"f04b7943-417c-40b2-bc8f-a8a7ed916a47","Type":"ContainerDied","Data":"cbfc5b84adb285e89a571116dff301907aedf400d0c662903dac2f8455ed3a0a"} Mar 20 10:48:06 crc kubenswrapper[4748]: I0320 10:48:06.653996 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbfc5b84adb285e89a571116dff301907aedf400d0c662903dac2f8455ed3a0a" Mar 20 10:48:06 crc kubenswrapper[4748]: I0320 10:48:06.654014 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566728-z252s" Mar 20 10:48:07 crc kubenswrapper[4748]: I0320 10:48:07.019985 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566722-lznxj"] Mar 20 10:48:07 crc kubenswrapper[4748]: I0320 10:48:07.023901 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566722-lznxj"] Mar 20 10:48:07 crc kubenswrapper[4748]: I0320 10:48:07.528921 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a83ac80-507f-4bb4-875c-72191dfc5d37" path="/var/lib/kubelet/pods/7a83ac80-507f-4bb4-875c-72191dfc5d37/volumes" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.423816 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xdzb8"] Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.424535 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovn-controller" containerID="cri-o://87af6eef1dc0be4682b0bbfe7b485e49360e19dd3b48012cc6a0e79e02b6c3f8" gracePeriod=30 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.424591 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="nbdb" containerID="cri-o://a22f1239a0ca652003eb95b617e5a6b84f13d70ac2dca8f1593c2fb3cc3ff87a" gracePeriod=30 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.424682 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="northd" containerID="cri-o://2bf2050b15cfd2fd678dca05462c6e8ce79672ef315106eb9fdc8fea4ab5c0a6" gracePeriod=30 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.424735 4748 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8b874abcca50cad2b1c5a29e4d7254aea1e0233ad9280e327d03c34634402211" gracePeriod=30 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.424781 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="kube-rbac-proxy-node" containerID="cri-o://52213e8defefb79f069d768559f145776cd25ea532c71296687b5922c982adf4" gracePeriod=30 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.424786 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="sbdb" containerID="cri-o://fe1f6bd9aee16c9e4c4607895576f28294888b4c1a29f8bf4b4a23bc5b35ec68" gracePeriod=30 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.424821 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovn-acl-logging" containerID="cri-o://f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f" gracePeriod=30 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.468933 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovnkube-controller" containerID="cri-o://0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c" gracePeriod=30 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.617321 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-jt8rf" Mar 20 10:48:09 crc 
kubenswrapper[4748]: I0320 10:48:09.673433 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xdzb8_f31addae-43ae-459d-bf9d-b5c0ee58faba/ovn-acl-logging/0.log" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.673821 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xdzb8_f31addae-43ae-459d-bf9d-b5c0ee58faba/ovn-controller/0.log" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674106 4748 generic.go:334] "Generic (PLEG): container finished" podID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerID="0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c" exitCode=0 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674130 4748 generic.go:334] "Generic (PLEG): container finished" podID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerID="fe1f6bd9aee16c9e4c4607895576f28294888b4c1a29f8bf4b4a23bc5b35ec68" exitCode=0 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674137 4748 generic.go:334] "Generic (PLEG): container finished" podID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerID="a22f1239a0ca652003eb95b617e5a6b84f13d70ac2dca8f1593c2fb3cc3ff87a" exitCode=0 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674145 4748 generic.go:334] "Generic (PLEG): container finished" podID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerID="2bf2050b15cfd2fd678dca05462c6e8ce79672ef315106eb9fdc8fea4ab5c0a6" exitCode=0 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674152 4748 generic.go:334] "Generic (PLEG): container finished" podID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerID="8b874abcca50cad2b1c5a29e4d7254aea1e0233ad9280e327d03c34634402211" exitCode=0 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674159 4748 generic.go:334] "Generic (PLEG): container finished" podID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerID="52213e8defefb79f069d768559f145776cd25ea532c71296687b5922c982adf4" exitCode=0 Mar 20 10:48:09 crc 
kubenswrapper[4748]: I0320 10:48:09.674165 4748 generic.go:334] "Generic (PLEG): container finished" podID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerID="f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f" exitCode=143 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674172 4748 generic.go:334] "Generic (PLEG): container finished" podID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerID="87af6eef1dc0be4682b0bbfe7b485e49360e19dd3b48012cc6a0e79e02b6c3f8" exitCode=143 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674201 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerDied","Data":"0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c"} Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674226 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerDied","Data":"fe1f6bd9aee16c9e4c4607895576f28294888b4c1a29f8bf4b4a23bc5b35ec68"} Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674239 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerDied","Data":"a22f1239a0ca652003eb95b617e5a6b84f13d70ac2dca8f1593c2fb3cc3ff87a"} Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674248 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerDied","Data":"2bf2050b15cfd2fd678dca05462c6e8ce79672ef315106eb9fdc8fea4ab5c0a6"} Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674256 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" 
event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerDied","Data":"8b874abcca50cad2b1c5a29e4d7254aea1e0233ad9280e327d03c34634402211"} Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674264 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerDied","Data":"52213e8defefb79f069d768559f145776cd25ea532c71296687b5922c982adf4"} Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674272 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerDied","Data":"f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f"} Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.674280 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerDied","Data":"87af6eef1dc0be4682b0bbfe7b485e49360e19dd3b48012cc6a0e79e02b6c3f8"} Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.675561 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5ksw_4275e40d-41ca-4fe4-a44b-fe86f4d2e78b/kube-multus/0.log" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.675587 4748 generic.go:334] "Generic (PLEG): container finished" podID="4275e40d-41ca-4fe4-a44b-fe86f4d2e78b" containerID="6d7da9e84325d6db16238d0a74d6557fb637f246413b37115035e4fd0aa8eaa9" exitCode=2 Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.675601 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5ksw" event={"ID":"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b","Type":"ContainerDied","Data":"6d7da9e84325d6db16238d0a74d6557fb637f246413b37115035e4fd0aa8eaa9"} Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.676234 4748 scope.go:117] "RemoveContainer" 
containerID="6d7da9e84325d6db16238d0a74d6557fb637f246413b37115035e4fd0aa8eaa9" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.808755 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xdzb8_f31addae-43ae-459d-bf9d-b5c0ee58faba/ovn-acl-logging/0.log" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.809328 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xdzb8_f31addae-43ae-459d-bf9d-b5c0ee58faba/ovn-controller/0.log" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.809771 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.847755 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-node-log\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.847816 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-node-log" (OuterVolumeSpecName: "node-log") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848186 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-script-lib\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848267 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-bin\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848317 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848341 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-config\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848494 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-netns\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848527 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ndt8m\" (UniqueName: \"kubernetes.io/projected/f31addae-43ae-459d-bf9d-b5c0ee58faba-kube-api-access-ndt8m\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848547 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-openvswitch\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848570 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-env-overrides\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848604 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-ovn-kubernetes\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848622 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-etc-openvswitch\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848643 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-kubelet\") pod 
\"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848682 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-systemd-units\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848719 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-netd\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848740 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-slash\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848762 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-systemd\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848798 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-ovn\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848826 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-log-socket\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848883 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovn-node-metrics-cert\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848915 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-var-lib-openvswitch\") pod \"f31addae-43ae-459d-bf9d-b5c0ee58faba\" (UID: \"f31addae-43ae-459d-bf9d-b5c0ee58faba\") " Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848444 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848441 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848900 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.848938 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849000 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849039 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849108 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849181 4748 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849196 4748 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849208 4748 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849220 4748 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849233 4748 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849247 4748 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849279 4748 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849293 4748 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849311 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849323 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849349 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). 
InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849353 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849375 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-slash" (OuterVolumeSpecName: "host-slash") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849380 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849398 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-log-socket" (OuterVolumeSpecName: "log-socket") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849471 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.849525 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.856589 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.856919 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31addae-43ae-459d-bf9d-b5c0ee58faba-kube-api-access-ndt8m" (OuterVolumeSpecName: "kube-api-access-ndt8m") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "kube-api-access-ndt8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.865618 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-997hk"] Mar 20 10:48:09 crc kubenswrapper[4748]: E0320 10:48:09.866050 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="nbdb" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866077 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="nbdb" Mar 20 10:48:09 crc kubenswrapper[4748]: E0320 10:48:09.866096 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovnkube-controller" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866105 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovnkube-controller" Mar 20 10:48:09 crc kubenswrapper[4748]: E0320 10:48:09.866115 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovn-controller" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866127 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovn-controller" Mar 20 10:48:09 crc kubenswrapper[4748]: E0320 10:48:09.866135 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovn-acl-logging" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866143 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovn-acl-logging" Mar 20 10:48:09 crc kubenswrapper[4748]: E0320 10:48:09.866159 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="kubecfg-setup" Mar 
20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866167 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="kubecfg-setup" Mar 20 10:48:09 crc kubenswrapper[4748]: E0320 10:48:09.866181 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="kube-rbac-proxy-node" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866189 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="kube-rbac-proxy-node" Mar 20 10:48:09 crc kubenswrapper[4748]: E0320 10:48:09.866201 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="sbdb" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866210 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="sbdb" Mar 20 10:48:09 crc kubenswrapper[4748]: E0320 10:48:09.866223 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866231 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 10:48:09 crc kubenswrapper[4748]: E0320 10:48:09.866242 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="northd" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866251 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="northd" Mar 20 10:48:09 crc kubenswrapper[4748]: E0320 10:48:09.866263 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04b7943-417c-40b2-bc8f-a8a7ed916a47" containerName="oc" Mar 20 10:48:09 crc 
kubenswrapper[4748]: I0320 10:48:09.866272 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04b7943-417c-40b2-bc8f-a8a7ed916a47" containerName="oc" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866399 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04b7943-417c-40b2-bc8f-a8a7ed916a47" containerName="oc" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866421 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="sbdb" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866440 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="kube-rbac-proxy-node" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866457 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovn-controller" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866469 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovnkube-controller" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866483 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="northd" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866496 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866511 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="ovn-acl-logging" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.866523 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" containerName="nbdb" Mar 20 10:48:09 crc 
kubenswrapper[4748]: I0320 10:48:09.872171 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.880505 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f31addae-43ae-459d-bf9d-b5c0ee58faba" (UID: "f31addae-43ae-459d-bf9d-b5c0ee58faba"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950221 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-run-systemd\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950370 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-systemd-units\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950400 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-run-netns\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950433 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-cni-netd\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950458 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-cni-bin\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950478 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950509 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-kubelet\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950530 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-node-log\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950552 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vrhq9\" (UniqueName: \"kubernetes.io/projected/bfc6a8bc-4449-4770-8d97-537c2e2a450b-kube-api-access-vrhq9\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950575 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-log-socket\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950598 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-run-ovn\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950617 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfc6a8bc-4449-4770-8d97-537c2e2a450b-env-overrides\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950642 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-etc-openvswitch\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950662 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-slash\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950701 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-run-ovn-kubernetes\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950729 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bfc6a8bc-4449-4770-8d97-537c2e2a450b-ovnkube-config\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950754 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfc6a8bc-4449-4770-8d97-537c2e2a450b-ovn-node-metrics-cert\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950782 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-var-lib-openvswitch\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 
10:48:09.950805 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-run-openvswitch\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950852 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfc6a8bc-4449-4770-8d97-537c2e2a450b-ovnkube-script-lib\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950903 4748 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950917 4748 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950930 4748 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f31addae-43ae-459d-bf9d-b5c0ee58faba-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950943 4748 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950955 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndt8m\" (UniqueName: 
\"kubernetes.io/projected/f31addae-43ae-459d-bf9d-b5c0ee58faba-kube-api-access-ndt8m\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950966 4748 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f31addae-43ae-459d-bf9d-b5c0ee58faba-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950978 4748 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.950989 4748 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.951000 4748 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.951011 4748 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.951023 4748 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:09 crc kubenswrapper[4748]: I0320 10:48:09.951034 4748 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f31addae-43ae-459d-bf9d-b5c0ee58faba-run-systemd\") on node \"crc\" DevicePath \"\"" 
Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.052404 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-run-ovn-kubernetes\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.052467 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bfc6a8bc-4449-4770-8d97-537c2e2a450b-ovnkube-config\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.052495 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfc6a8bc-4449-4770-8d97-537c2e2a450b-ovn-node-metrics-cert\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.052522 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-var-lib-openvswitch\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.052548 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-run-openvswitch\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc 
kubenswrapper[4748]: I0320 10:48:10.052639 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-var-lib-openvswitch\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.052723 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfc6a8bc-4449-4770-8d97-537c2e2a450b-ovnkube-script-lib\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.052769 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-run-openvswitch\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.053437 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-run-systemd\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.053629 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-systemd-units\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.053771 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-run-netns\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.053968 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-cni-netd\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054147 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-cni-bin\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054300 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054477 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-kubelet\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054621 4748 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-node-log\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054774 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrhq9\" (UniqueName: \"kubernetes.io/projected/bfc6a8bc-4449-4770-8d97-537c2e2a450b-kube-api-access-vrhq9\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054963 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-log-socket\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054175 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfc6a8bc-4449-4770-8d97-537c2e2a450b-ovnkube-script-lib\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054191 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-cni-bin\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.053643 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/bfc6a8bc-4449-4770-8d97-537c2e2a450b-ovnkube-config\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054359 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.053864 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-run-netns\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054515 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-kubelet\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.053525 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-run-systemd\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054676 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-node-log\") pod 
\"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.054001 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-cni-netd\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.053672 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-systemd-units\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.055042 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-log-socket\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.055293 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-run-ovn\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.055117 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-run-ovn\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc 
kubenswrapper[4748]: I0320 10:48:10.056037 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfc6a8bc-4449-4770-8d97-537c2e2a450b-env-overrides\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.056193 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-etc-openvswitch\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.056308 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfc6a8bc-4449-4770-8d97-537c2e2a450b-ovn-node-metrics-cert\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.056469 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-slash\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.056729 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfc6a8bc-4449-4770-8d97-537c2e2a450b-env-overrides\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.056782 4748 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-etc-openvswitch\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.057093 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-slash\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.057352 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc6a8bc-4449-4770-8d97-537c2e2a450b-host-run-ovn-kubernetes\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.072472 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrhq9\" (UniqueName: \"kubernetes.io/projected/bfc6a8bc-4449-4770-8d97-537c2e2a450b-kube-api-access-vrhq9\") pod \"ovnkube-node-997hk\" (UID: \"bfc6a8bc-4449-4770-8d97-537c2e2a450b\") " pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.190958 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.568481 4748 scope.go:117] "RemoveContainer" containerID="05f6d172d8cbaa53d4393427574a4285c4c262cb2a0d2c352deda90a14cd7a44" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.603029 4748 scope.go:117] "RemoveContainer" containerID="8b874abcca50cad2b1c5a29e4d7254aea1e0233ad9280e327d03c34634402211" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.617698 4748 scope.go:117] "RemoveContainer" containerID="87af6eef1dc0be4682b0bbfe7b485e49360e19dd3b48012cc6a0e79e02b6c3f8" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.633032 4748 scope.go:117] "RemoveContainer" containerID="fe1f6bd9aee16c9e4c4607895576f28294888b4c1a29f8bf4b4a23bc5b35ec68" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.647756 4748 scope.go:117] "RemoveContainer" containerID="a22f1239a0ca652003eb95b617e5a6b84f13d70ac2dca8f1593c2fb3cc3ff87a" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.663628 4748 scope.go:117] "RemoveContainer" containerID="52213e8defefb79f069d768559f145776cd25ea532c71296687b5922c982adf4" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.678932 4748 scope.go:117] "RemoveContainer" containerID="f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.685948 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" event={"ID":"bfc6a8bc-4449-4770-8d97-537c2e2a450b","Type":"ContainerStarted","Data":"42772536d496e9189df0275c820d10a5ea1c6fe5a5ed55fec873ecdd9ea22b9e"} Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.689752 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" event={"ID":"f31addae-43ae-459d-bf9d-b5c0ee58faba","Type":"ContainerDied","Data":"407b8a51d2008d9ca2a94f6c8b8c647c80a8508cff7c3681726790013119adb5"} Mar 20 10:48:10 crc 
kubenswrapper[4748]: I0320 10:48:10.689818 4748 scope.go:117] "RemoveContainer" containerID="0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c" Mar 20 10:48:10 crc kubenswrapper[4748]: I0320 10:48:10.690055 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xdzb8" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.002531 4748 scope.go:117] "RemoveContainer" containerID="de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.024727 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5ksw_4275e40d-41ca-4fe4-a44b-fe86f4d2e78b/kube-multus/0.log" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.024796 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5ksw" event={"ID":"4275e40d-41ca-4fe4-a44b-fe86f4d2e78b","Type":"ContainerStarted","Data":"2b95a44f75e6b2509aa89dc31d46d83ac57c24b4980480bb8893a96c378b0f50"} Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.051333 4748 scope.go:117] "RemoveContainer" containerID="2bf2050b15cfd2fd678dca05462c6e8ce79672ef315106eb9fdc8fea4ab5c0a6" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.051488 4748 scope.go:117] "RemoveContainer" containerID="0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c" Mar 20 10:48:11 crc kubenswrapper[4748]: E0320 10:48:11.053932 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c\": container with ID starting with 0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c not found: ID does not exist" containerID="0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c" Mar 20 10:48:11 crc kubenswrapper[4748]: E0320 10:48:11.053975 4748 kuberuntime_gc.go:150] "Failed to remove container" 
err="failed to get container status \"0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c\": rpc error: code = NotFound desc = could not find container \"0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c\": container with ID starting with 0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c not found: ID does not exist" containerID="0c85912fc1ac385cb91c20ee935596fe12cd390837609dbaec03b30f96d4168c" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.054005 4748 scope.go:117] "RemoveContainer" containerID="2bf2050b15cfd2fd678dca05462c6e8ce79672ef315106eb9fdc8fea4ab5c0a6" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.091128 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xdzb8"] Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.094566 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xdzb8"] Mar 20 10:48:11 crc kubenswrapper[4748]: E0320 10:48:11.122554 4748 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_northd_ovnkube-node-xdzb8_openshift-ovn-kubernetes_f31addae-43ae-459d-bf9d-b5c0ee58faba_0 in pod sandbox 407b8a51d2008d9ca2a94f6c8b8c647c80a8508cff7c3681726790013119adb5 from index: no such id: '2bf2050b15cfd2fd678dca05462c6e8ce79672ef315106eb9fdc8fea4ab5c0a6'" containerID="2bf2050b15cfd2fd678dca05462c6e8ce79672ef315106eb9fdc8fea4ab5c0a6" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.122583 4748 scope.go:117] "RemoveContainer" containerID="f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f" Mar 20 10:48:11 crc kubenswrapper[4748]: E0320 10:48:11.122615 4748 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_northd_ovnkube-node-xdzb8_openshift-ovn-kubernetes_f31addae-43ae-459d-bf9d-b5c0ee58faba_0 in pod sandbox 
407b8a51d2008d9ca2a94f6c8b8c647c80a8508cff7c3681726790013119adb5 from index: no such id: '2bf2050b15cfd2fd678dca05462c6e8ce79672ef315106eb9fdc8fea4ab5c0a6'" containerID="2bf2050b15cfd2fd678dca05462c6e8ce79672ef315106eb9fdc8fea4ab5c0a6" Mar 20 10:48:11 crc kubenswrapper[4748]: E0320 10:48:11.123106 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f\": container with ID starting with f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f not found: ID does not exist" containerID="f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.123144 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f"} err="failed to get container status \"f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f\": rpc error: code = NotFound desc = could not find container \"f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f\": container with ID starting with f795435d75086ef32acedeee8f60354fd510f49f2df1bd3d522380ef8d12365f not found: ID does not exist" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.123169 4748 scope.go:117] "RemoveContainer" containerID="de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341" Mar 20 10:48:11 crc kubenswrapper[4748]: E0320 10:48:11.123465 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\": container with ID starting with de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341 not found: ID does not exist" containerID="de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 
10:48:11.123515 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341"} err="failed to get container status \"de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\": rpc error: code = NotFound desc = could not find container \"de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341\": container with ID starting with de562d64e46c2b835774aef715b11ffb91bcd82bad3f663874d0ee232ea9b341 not found: ID does not exist" Mar 20 10:48:11 crc kubenswrapper[4748]: I0320 10:48:11.525789 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31addae-43ae-459d-bf9d-b5c0ee58faba" path="/var/lib/kubelet/pods/f31addae-43ae-459d-bf9d-b5c0ee58faba/volumes" Mar 20 10:48:12 crc kubenswrapper[4748]: I0320 10:48:12.031751 4748 generic.go:334] "Generic (PLEG): container finished" podID="bfc6a8bc-4449-4770-8d97-537c2e2a450b" containerID="5c420188a6c28e21738d3b51453703e1f7b7e37b026e028b5a89312ce3fcdf5d" exitCode=0 Mar 20 10:48:12 crc kubenswrapper[4748]: I0320 10:48:12.031796 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" event={"ID":"bfc6a8bc-4449-4770-8d97-537c2e2a450b","Type":"ContainerDied","Data":"5c420188a6c28e21738d3b51453703e1f7b7e37b026e028b5a89312ce3fcdf5d"} Mar 20 10:48:12 crc kubenswrapper[4748]: I0320 10:48:12.929210 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:48:12 crc kubenswrapper[4748]: I0320 10:48:12.929616 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:48:12 crc kubenswrapper[4748]: I0320 10:48:12.929724 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:48:12 crc kubenswrapper[4748]: I0320 10:48:12.930635 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"261e51a50d63c2c75ffa3ccd3542b6a7465ac5ce52a44fe320cc511b3ec023ef"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:48:12 crc kubenswrapper[4748]: I0320 10:48:12.930763 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://261e51a50d63c2c75ffa3ccd3542b6a7465ac5ce52a44fe320cc511b3ec023ef" gracePeriod=600 Mar 20 10:48:13 crc kubenswrapper[4748]: I0320 10:48:13.041399 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" event={"ID":"bfc6a8bc-4449-4770-8d97-537c2e2a450b","Type":"ContainerStarted","Data":"b1f37d847daf760d4a179987dcfb670cdee3d772144234dd786a5b6bf9c13fd0"} Mar 20 10:48:13 crc kubenswrapper[4748]: I0320 10:48:13.041454 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" event={"ID":"bfc6a8bc-4449-4770-8d97-537c2e2a450b","Type":"ContainerStarted","Data":"250bb3a664567b797786823499b16bde4dd9d47b110cf4620927a3e34be903b8"} Mar 20 10:48:13 crc kubenswrapper[4748]: I0320 10:48:13.041467 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" 
event={"ID":"bfc6a8bc-4449-4770-8d97-537c2e2a450b","Type":"ContainerStarted","Data":"0c33ffc3fe4dc23e4049ae39426540c2ebef43053176935e01ceadb1e415fe00"} Mar 20 10:48:13 crc kubenswrapper[4748]: I0320 10:48:13.041478 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" event={"ID":"bfc6a8bc-4449-4770-8d97-537c2e2a450b","Type":"ContainerStarted","Data":"51ba389377f7c226120573ed74c5e3ee28c19345bd29bc2558762ee6b4c39759"} Mar 20 10:48:13 crc kubenswrapper[4748]: I0320 10:48:13.041494 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" event={"ID":"bfc6a8bc-4449-4770-8d97-537c2e2a450b","Type":"ContainerStarted","Data":"47c459df9cd366bba4ff8f724e321cdf3df165b0b5141b9215390f2e6d6f2752"} Mar 20 10:48:13 crc kubenswrapper[4748]: I0320 10:48:13.041505 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" event={"ID":"bfc6a8bc-4449-4770-8d97-537c2e2a450b","Type":"ContainerStarted","Data":"01d701b5fdca2b74c34452bcd0eb9fd24ac39529c54723199c4c445482afd929"} Mar 20 10:48:14 crc kubenswrapper[4748]: I0320 10:48:14.053702 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="261e51a50d63c2c75ffa3ccd3542b6a7465ac5ce52a44fe320cc511b3ec023ef" exitCode=0 Mar 20 10:48:14 crc kubenswrapper[4748]: I0320 10:48:14.053766 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"261e51a50d63c2c75ffa3ccd3542b6a7465ac5ce52a44fe320cc511b3ec023ef"} Mar 20 10:48:14 crc kubenswrapper[4748]: I0320 10:48:14.054508 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" 
event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"9de74f70ec1f6bdcd394aa24f2c91f712fff4a83d715b6c60cbd64842427bc57"} Mar 20 10:48:14 crc kubenswrapper[4748]: I0320 10:48:14.054546 4748 scope.go:117] "RemoveContainer" containerID="3d722000c65d389fb1b79a8ffb41c8ee2832c576f06ef2d0738841c23d445dbc" Mar 20 10:48:16 crc kubenswrapper[4748]: I0320 10:48:16.086784 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" event={"ID":"bfc6a8bc-4449-4770-8d97-537c2e2a450b","Type":"ContainerStarted","Data":"abd1ffc16100de79e95e5f926ceea41cf3d6d94eb4c226ede54d6abb4d3f5491"} Mar 20 10:48:18 crc kubenswrapper[4748]: I0320 10:48:18.105135 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" event={"ID":"bfc6a8bc-4449-4770-8d97-537c2e2a450b","Type":"ContainerStarted","Data":"3e48085a5105fb8fd443233fe96bca3355d52e1e3076e3af9bf7c5d87d23463f"} Mar 20 10:48:18 crc kubenswrapper[4748]: I0320 10:48:18.105467 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:18 crc kubenswrapper[4748]: I0320 10:48:18.105501 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:18 crc kubenswrapper[4748]: I0320 10:48:18.144509 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" podStartSLOduration=9.144476842 podStartE2EDuration="9.144476842s" podCreationTimestamp="2026-03-20 10:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:48:18.141450977 +0000 UTC m=+733.282996801" watchObservedRunningTime="2026-03-20 10:48:18.144476842 +0000 UTC m=+733.286022666" Mar 20 10:48:18 crc kubenswrapper[4748]: I0320 10:48:18.146137 4748 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:19 crc kubenswrapper[4748]: I0320 10:48:19.110424 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:19 crc kubenswrapper[4748]: I0320 10:48:19.146917 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:28 crc kubenswrapper[4748]: I0320 10:48:28.810101 4748 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 10:48:40 crc kubenswrapper[4748]: I0320 10:48:40.210404 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-997hk" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.529342 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz"] Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.530802 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.533452 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.546624 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8nfh\" (UniqueName: \"kubernetes.io/projected/5e2ce8ab-5247-412b-a4fb-d35645c906c6-kube-api-access-t8nfh\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.546753 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.546890 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.546958 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz"] Mar 20 10:48:47 crc kubenswrapper[4748]: 
I0320 10:48:47.647948 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.648349 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8nfh\" (UniqueName: \"kubernetes.io/projected/5e2ce8ab-5247-412b-a4fb-d35645c906c6-kube-api-access-t8nfh\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.648382 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.648446 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.648928 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.670993 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8nfh\" (UniqueName: \"kubernetes.io/projected/5e2ce8ab-5247-412b-a4fb-d35645c906c6-kube-api-access-t8nfh\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:47 crc kubenswrapper[4748]: I0320 10:48:47.852496 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:48 crc kubenswrapper[4748]: I0320 10:48:48.302635 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz"] Mar 20 10:48:48 crc kubenswrapper[4748]: W0320 10:48:48.313604 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e2ce8ab_5247_412b_a4fb_d35645c906c6.slice/crio-3f343e8952a525565b976ee2a7d598056f9cb9c37a223e219dd200414f649576 WatchSource:0}: Error finding container 3f343e8952a525565b976ee2a7d598056f9cb9c37a223e219dd200414f649576: Status 404 returned error can't find the container with id 3f343e8952a525565b976ee2a7d598056f9cb9c37a223e219dd200414f649576 Mar 20 10:48:49 crc kubenswrapper[4748]: I0320 10:48:49.299122 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" 
event={"ID":"5e2ce8ab-5247-412b-a4fb-d35645c906c6","Type":"ContainerStarted","Data":"d782549b02d6d0c1c040f850763a4b8dfb8280742261c92f488865fd25803e15"} Mar 20 10:48:49 crc kubenswrapper[4748]: I0320 10:48:49.299434 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" event={"ID":"5e2ce8ab-5247-412b-a4fb-d35645c906c6","Type":"ContainerStarted","Data":"3f343e8952a525565b976ee2a7d598056f9cb9c37a223e219dd200414f649576"} Mar 20 10:48:49 crc kubenswrapper[4748]: I0320 10:48:49.891003 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvzz9"] Mar 20 10:48:49 crc kubenswrapper[4748]: I0320 10:48:49.894305 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:49 crc kubenswrapper[4748]: I0320 10:48:49.909935 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvzz9"] Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.082220 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-catalog-content\") pod \"redhat-operators-wvzz9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.082382 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmf7f\" (UniqueName: \"kubernetes.io/projected/4c1a42e3-9016-431c-9cc1-deccd468a3a9-kube-api-access-hmf7f\") pod \"redhat-operators-wvzz9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.082439 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-utilities\") pod \"redhat-operators-wvzz9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.183696 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-utilities\") pod \"redhat-operators-wvzz9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.183781 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-catalog-content\") pod \"redhat-operators-wvzz9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.183883 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmf7f\" (UniqueName: \"kubernetes.io/projected/4c1a42e3-9016-431c-9cc1-deccd468a3a9-kube-api-access-hmf7f\") pod \"redhat-operators-wvzz9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.184447 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-utilities\") pod \"redhat-operators-wvzz9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.184579 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-catalog-content\") pod \"redhat-operators-wvzz9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.210724 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmf7f\" (UniqueName: \"kubernetes.io/projected/4c1a42e3-9016-431c-9cc1-deccd468a3a9-kube-api-access-hmf7f\") pod \"redhat-operators-wvzz9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.215221 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.314191 4748 generic.go:334] "Generic (PLEG): container finished" podID="5e2ce8ab-5247-412b-a4fb-d35645c906c6" containerID="d782549b02d6d0c1c040f850763a4b8dfb8280742261c92f488865fd25803e15" exitCode=0 Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.314240 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" event={"ID":"5e2ce8ab-5247-412b-a4fb-d35645c906c6","Type":"ContainerDied","Data":"d782549b02d6d0c1c040f850763a4b8dfb8280742261c92f488865fd25803e15"} Mar 20 10:48:50 crc kubenswrapper[4748]: I0320 10:48:50.594938 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvzz9"] Mar 20 10:48:51 crc kubenswrapper[4748]: I0320 10:48:51.321293 4748 generic.go:334] "Generic (PLEG): container finished" podID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerID="3eb4c951913e1efb59afb58f4e2f5874d00d3227cee10e872a54aa159e0b6400" exitCode=0 Mar 20 10:48:51 crc kubenswrapper[4748]: I0320 10:48:51.321368 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-wvzz9" event={"ID":"4c1a42e3-9016-431c-9cc1-deccd468a3a9","Type":"ContainerDied","Data":"3eb4c951913e1efb59afb58f4e2f5874d00d3227cee10e872a54aa159e0b6400"} Mar 20 10:48:51 crc kubenswrapper[4748]: I0320 10:48:51.321601 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvzz9" event={"ID":"4c1a42e3-9016-431c-9cc1-deccd468a3a9","Type":"ContainerStarted","Data":"31a1d5d255b27f30f0c17e640b2798d6cdfe1982764549b80d6d83d395f03b38"} Mar 20 10:48:54 crc kubenswrapper[4748]: I0320 10:48:54.345109 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvzz9" event={"ID":"4c1a42e3-9016-431c-9cc1-deccd468a3a9","Type":"ContainerStarted","Data":"4b67475559be7198b4227b17123ca09cc2cc6e8cc283acf45d31210d1e8f4e17"} Mar 20 10:48:54 crc kubenswrapper[4748]: I0320 10:48:54.348801 4748 generic.go:334] "Generic (PLEG): container finished" podID="5e2ce8ab-5247-412b-a4fb-d35645c906c6" containerID="5057623b710c07099a0921ba84c027d5805ffc8914d9594cfa6a738a63a36a66" exitCode=0 Mar 20 10:48:54 crc kubenswrapper[4748]: I0320 10:48:54.348883 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" event={"ID":"5e2ce8ab-5247-412b-a4fb-d35645c906c6","Type":"ContainerDied","Data":"5057623b710c07099a0921ba84c027d5805ffc8914d9594cfa6a738a63a36a66"} Mar 20 10:48:55 crc kubenswrapper[4748]: I0320 10:48:55.356706 4748 generic.go:334] "Generic (PLEG): container finished" podID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerID="4b67475559be7198b4227b17123ca09cc2cc6e8cc283acf45d31210d1e8f4e17" exitCode=0 Mar 20 10:48:55 crc kubenswrapper[4748]: I0320 10:48:55.356803 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvzz9" 
event={"ID":"4c1a42e3-9016-431c-9cc1-deccd468a3a9","Type":"ContainerDied","Data":"4b67475559be7198b4227b17123ca09cc2cc6e8cc283acf45d31210d1e8f4e17"} Mar 20 10:48:55 crc kubenswrapper[4748]: I0320 10:48:55.363742 4748 generic.go:334] "Generic (PLEG): container finished" podID="5e2ce8ab-5247-412b-a4fb-d35645c906c6" containerID="987047b5fcfb195ff0c762de4e21ae8d68cf8b91aca15493025e99ddb4d41589" exitCode=0 Mar 20 10:48:55 crc kubenswrapper[4748]: I0320 10:48:55.363793 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" event={"ID":"5e2ce8ab-5247-412b-a4fb-d35645c906c6","Type":"ContainerDied","Data":"987047b5fcfb195ff0c762de4e21ae8d68cf8b91aca15493025e99ddb4d41589"} Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.371268 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvzz9" event={"ID":"4c1a42e3-9016-431c-9cc1-deccd468a3a9","Type":"ContainerStarted","Data":"2d77d7ba249fd897ceda75b5b7294240959ab1cb8a1be74d838c311ba9b43b90"} Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.395711 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvzz9" podStartSLOduration=2.7788623059999997 podStartE2EDuration="7.395694803s" podCreationTimestamp="2026-03-20 10:48:49 +0000 UTC" firstStartedPulling="2026-03-20 10:48:51.322649383 +0000 UTC m=+766.464195187" lastFinishedPulling="2026-03-20 10:48:55.93948188 +0000 UTC m=+771.081027684" observedRunningTime="2026-03-20 10:48:56.392042762 +0000 UTC m=+771.533588586" watchObservedRunningTime="2026-03-20 10:48:56.395694803 +0000 UTC m=+771.537240617" Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.629828 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.667980 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-util\") pod \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.668060 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8nfh\" (UniqueName: \"kubernetes.io/projected/5e2ce8ab-5247-412b-a4fb-d35645c906c6-kube-api-access-t8nfh\") pod \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.668130 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-bundle\") pod \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\" (UID: \"5e2ce8ab-5247-412b-a4fb-d35645c906c6\") " Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.668753 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-bundle" (OuterVolumeSpecName: "bundle") pod "5e2ce8ab-5247-412b-a4fb-d35645c906c6" (UID: "5e2ce8ab-5247-412b-a4fb-d35645c906c6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.674930 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e2ce8ab-5247-412b-a4fb-d35645c906c6-kube-api-access-t8nfh" (OuterVolumeSpecName: "kube-api-access-t8nfh") pod "5e2ce8ab-5247-412b-a4fb-d35645c906c6" (UID: "5e2ce8ab-5247-412b-a4fb-d35645c906c6"). InnerVolumeSpecName "kube-api-access-t8nfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.684321 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-util" (OuterVolumeSpecName: "util") pod "5e2ce8ab-5247-412b-a4fb-d35645c906c6" (UID: "5e2ce8ab-5247-412b-a4fb-d35645c906c6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.769185 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8nfh\" (UniqueName: \"kubernetes.io/projected/5e2ce8ab-5247-412b-a4fb-d35645c906c6-kube-api-access-t8nfh\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.769233 4748 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:56 crc kubenswrapper[4748]: I0320 10:48:56.769245 4748 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5e2ce8ab-5247-412b-a4fb-d35645c906c6-util\") on node \"crc\" DevicePath \"\"" Mar 20 10:48:57 crc kubenswrapper[4748]: I0320 10:48:57.378827 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" event={"ID":"5e2ce8ab-5247-412b-a4fb-d35645c906c6","Type":"ContainerDied","Data":"3f343e8952a525565b976ee2a7d598056f9cb9c37a223e219dd200414f649576"} Mar 20 10:48:57 crc kubenswrapper[4748]: I0320 10:48:57.378884 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f343e8952a525565b976ee2a7d598056f9cb9c37a223e219dd200414f649576" Mar 20 10:48:57 crc kubenswrapper[4748]: I0320 10:48:57.378909 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.216171 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.217250 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.253448 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hfdql"] Mar 20 10:49:00 crc kubenswrapper[4748]: E0320 10:49:00.253955 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e2ce8ab-5247-412b-a4fb-d35645c906c6" containerName="util" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.253994 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2ce8ab-5247-412b-a4fb-d35645c906c6" containerName="util" Mar 20 10:49:00 crc kubenswrapper[4748]: E0320 10:49:00.254013 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e2ce8ab-5247-412b-a4fb-d35645c906c6" containerName="pull" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.254019 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2ce8ab-5247-412b-a4fb-d35645c906c6" containerName="pull" Mar 20 10:49:00 crc kubenswrapper[4748]: E0320 10:49:00.254028 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e2ce8ab-5247-412b-a4fb-d35645c906c6" containerName="extract" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.254035 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2ce8ab-5247-412b-a4fb-d35645c906c6" containerName="extract" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.254130 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e2ce8ab-5247-412b-a4fb-d35645c906c6" containerName="extract" Mar 20 
10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.254616 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hfdql" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.256409 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-4sqjv" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.256537 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.256700 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.271127 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hfdql"] Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.411536 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gsp5\" (UniqueName: \"kubernetes.io/projected/6ec3f263-5ad8-494d-88d0-ec9f60f7c6bc-kube-api-access-5gsp5\") pod \"nmstate-operator-796d4cfff4-hfdql\" (UID: \"6ec3f263-5ad8-494d-88d0-ec9f60f7c6bc\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hfdql" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.512683 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gsp5\" (UniqueName: \"kubernetes.io/projected/6ec3f263-5ad8-494d-88d0-ec9f60f7c6bc-kube-api-access-5gsp5\") pod \"nmstate-operator-796d4cfff4-hfdql\" (UID: \"6ec3f263-5ad8-494d-88d0-ec9f60f7c6bc\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hfdql" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.569029 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gsp5\" (UniqueName: 
\"kubernetes.io/projected/6ec3f263-5ad8-494d-88d0-ec9f60f7c6bc-kube-api-access-5gsp5\") pod \"nmstate-operator-796d4cfff4-hfdql\" (UID: \"6ec3f263-5ad8-494d-88d0-ec9f60f7c6bc\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hfdql" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.570465 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hfdql" Mar 20 10:49:00 crc kubenswrapper[4748]: I0320 10:49:00.837863 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hfdql"] Mar 20 10:49:01 crc kubenswrapper[4748]: I0320 10:49:01.265594 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wvzz9" podUID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerName="registry-server" probeResult="failure" output=< Mar 20 10:49:01 crc kubenswrapper[4748]: timeout: failed to connect service ":50051" within 1s Mar 20 10:49:01 crc kubenswrapper[4748]: > Mar 20 10:49:01 crc kubenswrapper[4748]: I0320 10:49:01.402400 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hfdql" event={"ID":"6ec3f263-5ad8-494d-88d0-ec9f60f7c6bc","Type":"ContainerStarted","Data":"57518db340578ebbb2e765048955aafd3997270091dcdd724f8c3461dc392940"} Mar 20 10:49:07 crc kubenswrapper[4748]: I0320 10:49:07.447634 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hfdql" event={"ID":"6ec3f263-5ad8-494d-88d0-ec9f60f7c6bc","Type":"ContainerStarted","Data":"fd027754ed36780ad1fc2ce6cb12949327a5224b4c3d6eb3b63d5466499dc84b"} Mar 20 10:49:07 crc kubenswrapper[4748]: I0320 10:49:07.465670 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hfdql" podStartSLOduration=1.5084846280000002 podStartE2EDuration="7.465645705s" podCreationTimestamp="2026-03-20 10:49:00 +0000 
UTC" firstStartedPulling="2026-03-20 10:49:00.844349847 +0000 UTC m=+775.985895661" lastFinishedPulling="2026-03-20 10:49:06.801510924 +0000 UTC m=+781.943056738" observedRunningTime="2026-03-20 10:49:07.464807445 +0000 UTC m=+782.606353289" watchObservedRunningTime="2026-03-20 10:49:07.465645705 +0000 UTC m=+782.607191519" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.568227 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn"] Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.569545 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.578408 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-nw62c" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.582261 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn"] Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.601763 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w284q"] Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.602507 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.613228 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-sms75"] Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.614119 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.619104 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.625802 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w284q"] Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.672412 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7dcf\" (UniqueName: \"kubernetes.io/projected/48cc57fd-25e5-490f-af0b-13a1e5f9be6d-kube-api-access-g7dcf\") pod \"nmstate-metrics-9b8c8685d-n2kjn\" (UID: \"48cc57fd-25e5-490f-af0b-13a1e5f9be6d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.732527 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz"] Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.733456 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.737918 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.738203 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mdsgc" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.742209 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.746308 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz"] Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.777092 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/12dadf04-5eff-4e48-96cd-d8033b0baf63-nmstate-lock\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.777189 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7dcf\" (UniqueName: \"kubernetes.io/projected/48cc57fd-25e5-490f-af0b-13a1e5f9be6d-kube-api-access-g7dcf\") pod \"nmstate-metrics-9b8c8685d-n2kjn\" (UID: \"48cc57fd-25e5-490f-af0b-13a1e5f9be6d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.777657 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/12dadf04-5eff-4e48-96cd-d8033b0baf63-dbus-socket\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 
20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.778184 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fd6c2ee9-cef6-406b-a6e0-e1f741be9f61-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w284q\" (UID: \"fd6c2ee9-cef6-406b-a6e0-e1f741be9f61\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.778221 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/12dadf04-5eff-4e48-96cd-d8033b0baf63-ovs-socket\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.778252 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7t8p\" (UniqueName: \"kubernetes.io/projected/12dadf04-5eff-4e48-96cd-d8033b0baf63-kube-api-access-q7t8p\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.778270 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mm8c\" (UniqueName: \"kubernetes.io/projected/fd6c2ee9-cef6-406b-a6e0-e1f741be9f61-kube-api-access-6mm8c\") pod \"nmstate-webhook-5f558f5558-w284q\" (UID: \"fd6c2ee9-cef6-406b-a6e0-e1f741be9f61\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.798930 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7dcf\" (UniqueName: \"kubernetes.io/projected/48cc57fd-25e5-490f-af0b-13a1e5f9be6d-kube-api-access-g7dcf\") pod \"nmstate-metrics-9b8c8685d-n2kjn\" (UID: 
\"48cc57fd-25e5-490f-af0b-13a1e5f9be6d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.879404 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2781b78b-43e7-4826-8e44-74f302a93478-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l8vtz\" (UID: \"2781b78b-43e7-4826-8e44-74f302a93478\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.879750 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/12dadf04-5eff-4e48-96cd-d8033b0baf63-nmstate-lock\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.879891 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/12dadf04-5eff-4e48-96cd-d8033b0baf63-dbus-socket\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.880005 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fd6c2ee9-cef6-406b-a6e0-e1f741be9f61-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w284q\" (UID: \"fd6c2ee9-cef6-406b-a6e0-e1f741be9f61\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.880119 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd2js\" (UniqueName: \"kubernetes.io/projected/2781b78b-43e7-4826-8e44-74f302a93478-kube-api-access-kd2js\") pod 
\"nmstate-console-plugin-86f58fcf4-l8vtz\" (UID: \"2781b78b-43e7-4826-8e44-74f302a93478\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.880263 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/12dadf04-5eff-4e48-96cd-d8033b0baf63-ovs-socket\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.880391 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2781b78b-43e7-4826-8e44-74f302a93478-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l8vtz\" (UID: \"2781b78b-43e7-4826-8e44-74f302a93478\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.880515 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7t8p\" (UniqueName: \"kubernetes.io/projected/12dadf04-5eff-4e48-96cd-d8033b0baf63-kube-api-access-q7t8p\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.880619 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mm8c\" (UniqueName: \"kubernetes.io/projected/fd6c2ee9-cef6-406b-a6e0-e1f741be9f61-kube-api-access-6mm8c\") pod \"nmstate-webhook-5f558f5558-w284q\" (UID: \"fd6c2ee9-cef6-406b-a6e0-e1f741be9f61\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.879939 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/12dadf04-5eff-4e48-96cd-d8033b0baf63-nmstate-lock\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.880343 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/12dadf04-5eff-4e48-96cd-d8033b0baf63-ovs-socket\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.880392 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/12dadf04-5eff-4e48-96cd-d8033b0baf63-dbus-socket\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.883409 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fd6c2ee9-cef6-406b-a6e0-e1f741be9f61-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w284q\" (UID: \"fd6c2ee9-cef6-406b-a6e0-e1f741be9f61\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.886097 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.914424 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mm8c\" (UniqueName: \"kubernetes.io/projected/fd6c2ee9-cef6-406b-a6e0-e1f741be9f61-kube-api-access-6mm8c\") pod \"nmstate-webhook-5f558f5558-w284q\" (UID: \"fd6c2ee9-cef6-406b-a6e0-e1f741be9f61\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.917163 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7t8p\" (UniqueName: \"kubernetes.io/projected/12dadf04-5eff-4e48-96cd-d8033b0baf63-kube-api-access-q7t8p\") pod \"nmstate-handler-sms75\" (UID: \"12dadf04-5eff-4e48-96cd-d8033b0baf63\") " pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.919256 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-84bb7df847-9nwwt"] Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.920228 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.935311 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84bb7df847-9nwwt"] Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.936285 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.953562 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.981638 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-oauth-serving-cert\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.981689 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2js\" (UniqueName: \"kubernetes.io/projected/2781b78b-43e7-4826-8e44-74f302a93478-kube-api-access-kd2js\") pod \"nmstate-console-plugin-86f58fcf4-l8vtz\" (UID: \"2781b78b-43e7-4826-8e44-74f302a93478\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.981727 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2781b78b-43e7-4826-8e44-74f302a93478-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l8vtz\" (UID: \"2781b78b-43e7-4826-8e44-74f302a93478\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.981752 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx4n8\" (UniqueName: \"kubernetes.io/projected/22b14afb-8498-470c-8824-783100dcfaa6-kube-api-access-hx4n8\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.981787 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-trusted-ca-bundle\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.981805 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-console-config\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.981907 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22b14afb-8498-470c-8824-783100dcfaa6-console-serving-cert\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.981940 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2781b78b-43e7-4826-8e44-74f302a93478-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l8vtz\" (UID: \"2781b78b-43e7-4826-8e44-74f302a93478\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.981968 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-service-ca\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.982006 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22b14afb-8498-470c-8824-783100dcfaa6-console-oauth-config\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:08 crc kubenswrapper[4748]: W0320 10:49:08.985383 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12dadf04_5eff_4e48_96cd_d8033b0baf63.slice/crio-798ff8f0f52ac239ce72c65ad1f1cc239fdeb8b5539a32479840037b6fb0325f WatchSource:0}: Error finding container 798ff8f0f52ac239ce72c65ad1f1cc239fdeb8b5539a32479840037b6fb0325f: Status 404 returned error can't find the container with id 798ff8f0f52ac239ce72c65ad1f1cc239fdeb8b5539a32479840037b6fb0325f Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.986212 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2781b78b-43e7-4826-8e44-74f302a93478-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l8vtz\" (UID: \"2781b78b-43e7-4826-8e44-74f302a93478\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.986325 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2781b78b-43e7-4826-8e44-74f302a93478-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l8vtz\" (UID: \"2781b78b-43e7-4826-8e44-74f302a93478\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.995468 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:49:08 crc kubenswrapper[4748]: I0320 10:49:08.998798 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kd2js\" (UniqueName: \"kubernetes.io/projected/2781b78b-43e7-4826-8e44-74f302a93478-kube-api-access-kd2js\") pod \"nmstate-console-plugin-86f58fcf4-l8vtz\" (UID: \"2781b78b-43e7-4826-8e44-74f302a93478\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.051500 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.082914 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-trusted-ca-bundle\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.082960 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-console-config\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.082988 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22b14afb-8498-470c-8824-783100dcfaa6-console-serving-cert\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.083018 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-service-ca\") pod \"console-84bb7df847-9nwwt\" (UID: 
\"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.083052 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22b14afb-8498-470c-8824-783100dcfaa6-console-oauth-config\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.083069 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-oauth-serving-cert\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.083098 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx4n8\" (UniqueName: \"kubernetes.io/projected/22b14afb-8498-470c-8824-783100dcfaa6-kube-api-access-hx4n8\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.084435 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-console-config\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.084909 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-trusted-ca-bundle\") pod \"console-84bb7df847-9nwwt\" (UID: 
\"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.085649 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-service-ca\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.086354 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22b14afb-8498-470c-8824-783100dcfaa6-oauth-serving-cert\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.091080 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22b14afb-8498-470c-8824-783100dcfaa6-console-oauth-config\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.092572 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22b14afb-8498-470c-8824-783100dcfaa6-console-serving-cert\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.101741 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx4n8\" (UniqueName: \"kubernetes.io/projected/22b14afb-8498-470c-8824-783100dcfaa6-kube-api-access-hx4n8\") pod \"console-84bb7df847-9nwwt\" (UID: \"22b14afb-8498-470c-8824-783100dcfaa6\") " 
pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: W0320 10:49:09.160473 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48cc57fd_25e5_490f_af0b_13a1e5f9be6d.slice/crio-36bfed931580b9d2cba7093cfcfcb7f80e581054d975d6ab2e6840963babe6ff WatchSource:0}: Error finding container 36bfed931580b9d2cba7093cfcfcb7f80e581054d975d6ab2e6840963babe6ff: Status 404 returned error can't find the container with id 36bfed931580b9d2cba7093cfcfcb7f80e581054d975d6ab2e6840963babe6ff Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.160786 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn"] Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.289654 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.416093 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w284q"] Mar 20 10:49:09 crc kubenswrapper[4748]: W0320 10:49:09.425252 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd6c2ee9_cef6_406b_a6e0_e1f741be9f61.slice/crio-007f934d56d8ea9a074de1afa4284e34bcf0a1d23af5291e69d40564053766f0 WatchSource:0}: Error finding container 007f934d56d8ea9a074de1afa4284e34bcf0a1d23af5291e69d40564053766f0: Status 404 returned error can't find the container with id 007f934d56d8ea9a074de1afa4284e34bcf0a1d23af5291e69d40564053766f0 Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.464271 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sms75" event={"ID":"12dadf04-5eff-4e48-96cd-d8033b0baf63","Type":"ContainerStarted","Data":"798ff8f0f52ac239ce72c65ad1f1cc239fdeb8b5539a32479840037b6fb0325f"} Mar 20 10:49:09 
crc kubenswrapper[4748]: I0320 10:49:09.466511 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84bb7df847-9nwwt"] Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.466694 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn" event={"ID":"48cc57fd-25e5-490f-af0b-13a1e5f9be6d","Type":"ContainerStarted","Data":"36bfed931580b9d2cba7093cfcfcb7f80e581054d975d6ab2e6840963babe6ff"} Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.467975 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" event={"ID":"fd6c2ee9-cef6-406b-a6e0-e1f741be9f61","Type":"ContainerStarted","Data":"007f934d56d8ea9a074de1afa4284e34bcf0a1d23af5291e69d40564053766f0"} Mar 20 10:49:09 crc kubenswrapper[4748]: W0320 10:49:09.473861 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22b14afb_8498_470c_8824_783100dcfaa6.slice/crio-fd70c490ee9f17012d3ecc58c254f708c0f70da326cb331c11b6d30fd462b579 WatchSource:0}: Error finding container fd70c490ee9f17012d3ecc58c254f708c0f70da326cb331c11b6d30fd462b579: Status 404 returned error can't find the container with id fd70c490ee9f17012d3ecc58c254f708c0f70da326cb331c11b6d30fd462b579 Mar 20 10:49:09 crc kubenswrapper[4748]: I0320 10:49:09.489118 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz"] Mar 20 10:49:10 crc kubenswrapper[4748]: I0320 10:49:10.275459 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:49:10 crc kubenswrapper[4748]: I0320 10:49:10.320696 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:49:10 crc kubenswrapper[4748]: I0320 10:49:10.475545 4748 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" event={"ID":"2781b78b-43e7-4826-8e44-74f302a93478","Type":"ContainerStarted","Data":"29d0d1a8595dadc394fed8697ff7797d4d525d2c1d57fa03b24bc9250d2238e5"} Mar 20 10:49:10 crc kubenswrapper[4748]: I0320 10:49:10.477146 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84bb7df847-9nwwt" event={"ID":"22b14afb-8498-470c-8824-783100dcfaa6","Type":"ContainerStarted","Data":"fd70c490ee9f17012d3ecc58c254f708c0f70da326cb331c11b6d30fd462b579"} Mar 20 10:49:10 crc kubenswrapper[4748]: I0320 10:49:10.506025 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvzz9"] Mar 20 10:49:11 crc kubenswrapper[4748]: I0320 10:49:11.485662 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84bb7df847-9nwwt" event={"ID":"22b14afb-8498-470c-8824-783100dcfaa6","Type":"ContainerStarted","Data":"30911a5969249c0b1954b43d77104f5f6b1be4941eda106237d774205a1441fb"} Mar 20 10:49:11 crc kubenswrapper[4748]: I0320 10:49:11.485881 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvzz9" podUID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerName="registry-server" containerID="cri-o://2d77d7ba249fd897ceda75b5b7294240959ab1cb8a1be74d838c311ba9b43b90" gracePeriod=2 Mar 20 10:49:11 crc kubenswrapper[4748]: I0320 10:49:11.510113 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84bb7df847-9nwwt" podStartSLOduration=3.510090919 podStartE2EDuration="3.510090919s" podCreationTimestamp="2026-03-20 10:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:49:11.503805463 +0000 UTC m=+786.645351287" watchObservedRunningTime="2026-03-20 10:49:11.510090919 +0000 UTC m=+786.651636733" Mar 20 
10:49:12 crc kubenswrapper[4748]: I0320 10:49:12.498677 4748 generic.go:334] "Generic (PLEG): container finished" podID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerID="2d77d7ba249fd897ceda75b5b7294240959ab1cb8a1be74d838c311ba9b43b90" exitCode=0 Mar 20 10:49:12 crc kubenswrapper[4748]: I0320 10:49:12.498712 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvzz9" event={"ID":"4c1a42e3-9016-431c-9cc1-deccd468a3a9","Type":"ContainerDied","Data":"2d77d7ba249fd897ceda75b5b7294240959ab1cb8a1be74d838c311ba9b43b90"} Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.282542 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.349263 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-catalog-content\") pod \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.349312 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmf7f\" (UniqueName: \"kubernetes.io/projected/4c1a42e3-9016-431c-9cc1-deccd468a3a9-kube-api-access-hmf7f\") pod \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.349349 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-utilities\") pod \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\" (UID: \"4c1a42e3-9016-431c-9cc1-deccd468a3a9\") " Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.350360 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-utilities" (OuterVolumeSpecName: "utilities") pod "4c1a42e3-9016-431c-9cc1-deccd468a3a9" (UID: "4c1a42e3-9016-431c-9cc1-deccd468a3a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.354677 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c1a42e3-9016-431c-9cc1-deccd468a3a9-kube-api-access-hmf7f" (OuterVolumeSpecName: "kube-api-access-hmf7f") pod "4c1a42e3-9016-431c-9cc1-deccd468a3a9" (UID: "4c1a42e3-9016-431c-9cc1-deccd468a3a9"). InnerVolumeSpecName "kube-api-access-hmf7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.450928 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.451278 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmf7f\" (UniqueName: \"kubernetes.io/projected/4c1a42e3-9016-431c-9cc1-deccd468a3a9-kube-api-access-hmf7f\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.487387 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c1a42e3-9016-431c-9cc1-deccd468a3a9" (UID: "4c1a42e3-9016-431c-9cc1-deccd468a3a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.510388 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvzz9" event={"ID":"4c1a42e3-9016-431c-9cc1-deccd468a3a9","Type":"ContainerDied","Data":"31a1d5d255b27f30f0c17e640b2798d6cdfe1982764549b80d6d83d395f03b38"} Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.511027 4748 scope.go:117] "RemoveContainer" containerID="2d77d7ba249fd897ceda75b5b7294240959ab1cb8a1be74d838c311ba9b43b90" Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.510586 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvzz9" Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.534173 4748 scope.go:117] "RemoveContainer" containerID="4b67475559be7198b4227b17123ca09cc2cc6e8cc283acf45d31210d1e8f4e17" Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.553294 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvzz9"] Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.559327 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c1a42e3-9016-431c-9cc1-deccd468a3a9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.560361 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvzz9"] Mar 20 10:49:13 crc kubenswrapper[4748]: I0320 10:49:13.565789 4748 scope.go:117] "RemoveContainer" containerID="3eb4c951913e1efb59afb58f4e2f5874d00d3227cee10e872a54aa159e0b6400" Mar 20 10:49:15 crc kubenswrapper[4748]: I0320 10:49:15.524342 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" path="/var/lib/kubelet/pods/4c1a42e3-9016-431c-9cc1-deccd468a3a9/volumes" Mar 20 10:49:15 crc 
kubenswrapper[4748]: I0320 10:49:15.526700 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sms75" event={"ID":"12dadf04-5eff-4e48-96cd-d8033b0baf63","Type":"ContainerStarted","Data":"0762215786ea6d6b7f6b2e79ab0db89b5f6356fa1c4829096e48f2b88aaa37ae"} Mar 20 10:49:15 crc kubenswrapper[4748]: I0320 10:49:15.528989 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn" event={"ID":"48cc57fd-25e5-490f-af0b-13a1e5f9be6d","Type":"ContainerStarted","Data":"9b8c4b72720df29ba8a502658b9b3694744d18d3f82d6f9ac5924689b8464aee"} Mar 20 10:49:15 crc kubenswrapper[4748]: I0320 10:49:15.530180 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" event={"ID":"2781b78b-43e7-4826-8e44-74f302a93478","Type":"ContainerStarted","Data":"e8553ffd9812d65212e4057484ae6295c81a639e9df2d95edf247fb4c5c11ddb"} Mar 20 10:49:15 crc kubenswrapper[4748]: I0320 10:49:15.532926 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" event={"ID":"fd6c2ee9-cef6-406b-a6e0-e1f741be9f61","Type":"ContainerStarted","Data":"3b8ec86892c58c93c117f800f9e4bfbb90ab096493c9495a289033b9f62d3d0b"} Mar 20 10:49:15 crc kubenswrapper[4748]: I0320 10:49:15.533069 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" Mar 20 10:49:15 crc kubenswrapper[4748]: I0320 10:49:15.614567 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-sms75" podStartSLOduration=1.460256361 podStartE2EDuration="7.6145304s" podCreationTimestamp="2026-03-20 10:49:08 +0000 UTC" firstStartedPulling="2026-03-20 10:49:08.995253339 +0000 UTC m=+784.136799153" lastFinishedPulling="2026-03-20 10:49:15.149527378 +0000 UTC m=+790.291073192" observedRunningTime="2026-03-20 10:49:15.588820193 +0000 UTC 
m=+790.730366017" watchObservedRunningTime="2026-03-20 10:49:15.6145304 +0000 UTC m=+790.756076214" Mar 20 10:49:15 crc kubenswrapper[4748]: I0320 10:49:15.617769 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" podStartSLOduration=1.884454212 podStartE2EDuration="7.61775609s" podCreationTimestamp="2026-03-20 10:49:08 +0000 UTC" firstStartedPulling="2026-03-20 10:49:09.432318369 +0000 UTC m=+784.573864183" lastFinishedPulling="2026-03-20 10:49:15.165620247 +0000 UTC m=+790.307166061" observedRunningTime="2026-03-20 10:49:15.611488785 +0000 UTC m=+790.753034609" watchObservedRunningTime="2026-03-20 10:49:15.61775609 +0000 UTC m=+790.759301904" Mar 20 10:49:15 crc kubenswrapper[4748]: I0320 10:49:15.634827 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l8vtz" podStartSLOduration=1.985328624 podStartE2EDuration="7.634804883s" podCreationTimestamp="2026-03-20 10:49:08 +0000 UTC" firstStartedPulling="2026-03-20 10:49:09.498194683 +0000 UTC m=+784.639740497" lastFinishedPulling="2026-03-20 10:49:15.147670952 +0000 UTC m=+790.289216756" observedRunningTime="2026-03-20 10:49:15.631484431 +0000 UTC m=+790.773030265" watchObservedRunningTime="2026-03-20 10:49:15.634804883 +0000 UTC m=+790.776350687" Mar 20 10:49:16 crc kubenswrapper[4748]: I0320 10:49:16.538079 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:18 crc kubenswrapper[4748]: I0320 10:49:18.552972 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn" event={"ID":"48cc57fd-25e5-490f-af0b-13a1e5f9be6d","Type":"ContainerStarted","Data":"621519e9a98678a4e8e8122f93ac708b8b6a1ca375610f7edd928d732a1a8d5c"} Mar 20 10:49:18 crc kubenswrapper[4748]: I0320 10:49:18.578877 4748 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-n2kjn" podStartSLOduration=1.6914171439999999 podStartE2EDuration="10.578816815s" podCreationTimestamp="2026-03-20 10:49:08 +0000 UTC" firstStartedPulling="2026-03-20 10:49:09.163351128 +0000 UTC m=+784.304896942" lastFinishedPulling="2026-03-20 10:49:18.050750789 +0000 UTC m=+793.192296613" observedRunningTime="2026-03-20 10:49:18.575986565 +0000 UTC m=+793.717532429" watchObservedRunningTime="2026-03-20 10:49:18.578816815 +0000 UTC m=+793.720362639" Mar 20 10:49:19 crc kubenswrapper[4748]: I0320 10:49:19.290357 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:19 crc kubenswrapper[4748]: I0320 10:49:19.290722 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:19 crc kubenswrapper[4748]: I0320 10:49:19.294760 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:19 crc kubenswrapper[4748]: I0320 10:49:19.562248 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84bb7df847-9nwwt" Mar 20 10:49:19 crc kubenswrapper[4748]: I0320 10:49:19.614755 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dmmdh"] Mar 20 10:49:23 crc kubenswrapper[4748]: I0320 10:49:23.980395 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-sms75" Mar 20 10:49:28 crc kubenswrapper[4748]: I0320 10:49:28.942826 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w284q" Mar 20 10:49:41 crc kubenswrapper[4748]: I0320 10:49:41.335098 4748 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk"] Mar 20 10:49:41 crc kubenswrapper[4748]: E0320 10:49:41.336012 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerName="registry-server" Mar 20 10:49:41 crc kubenswrapper[4748]: I0320 10:49:41.336032 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerName="registry-server" Mar 20 10:49:41 crc kubenswrapper[4748]: E0320 10:49:41.336049 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerName="extract-utilities" Mar 20 10:49:41 crc kubenswrapper[4748]: I0320 10:49:41.336062 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerName="extract-utilities" Mar 20 10:49:41 crc kubenswrapper[4748]: E0320 10:49:41.336088 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerName="extract-content" Mar 20 10:49:41 crc kubenswrapper[4748]: I0320 10:49:41.336103 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerName="extract-content" Mar 20 10:49:41 crc kubenswrapper[4748]: I0320 10:49:41.336271 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1a42e3-9016-431c-9cc1-deccd468a3a9" containerName="registry-server" Mar 20 10:49:41 crc kubenswrapper[4748]: I0320 10:49:41.337314 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:41 crc kubenswrapper[4748]: I0320 10:49:41.339045 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 10:49:41 crc kubenswrapper[4748]: I0320 10:49:41.349207 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk"] Mar 20 10:49:42 crc kubenswrapper[4748]: I0320 10:49:42.295631 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlpf8\" (UniqueName: \"kubernetes.io/projected/22772fc8-2959-4dfa-b1aa-070f9db955a1-kube-api-access-dlpf8\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:42 crc kubenswrapper[4748]: I0320 10:49:42.295697 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:42 crc kubenswrapper[4748]: I0320 10:49:42.295717 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:42 crc kubenswrapper[4748]: 
I0320 10:49:42.396833 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlpf8\" (UniqueName: \"kubernetes.io/projected/22772fc8-2959-4dfa-b1aa-070f9db955a1-kube-api-access-dlpf8\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:42 crc kubenswrapper[4748]: I0320 10:49:42.396933 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:42 crc kubenswrapper[4748]: I0320 10:49:42.396986 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:42 crc kubenswrapper[4748]: I0320 10:49:42.397657 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:42 crc kubenswrapper[4748]: I0320 10:49:42.397731 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:42 crc kubenswrapper[4748]: I0320 10:49:42.423202 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlpf8\" (UniqueName: \"kubernetes.io/projected/22772fc8-2959-4dfa-b1aa-070f9db955a1-kube-api-access-dlpf8\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:42 crc kubenswrapper[4748]: I0320 10:49:42.515157 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:42 crc kubenswrapper[4748]: I0320 10:49:42.700275 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk"] Mar 20 10:49:43 crc kubenswrapper[4748]: I0320 10:49:43.221412 4748 generic.go:334] "Generic (PLEG): container finished" podID="22772fc8-2959-4dfa-b1aa-070f9db955a1" containerID="daeab4eee5f66cb5e36ce79dcf0fd2b5289cc22b5c8e8eb84bed89e1b3e972b3" exitCode=0 Mar 20 10:49:43 crc kubenswrapper[4748]: I0320 10:49:43.221495 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" event={"ID":"22772fc8-2959-4dfa-b1aa-070f9db955a1","Type":"ContainerDied","Data":"daeab4eee5f66cb5e36ce79dcf0fd2b5289cc22b5c8e8eb84bed89e1b3e972b3"} Mar 20 10:49:43 crc kubenswrapper[4748]: I0320 10:49:43.221857 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" event={"ID":"22772fc8-2959-4dfa-b1aa-070f9db955a1","Type":"ContainerStarted","Data":"60dd38feeab077e871d7b755cc6a429c57634551ff8810876f8ba5f6d0da6b21"} Mar 20 10:49:44 crc kubenswrapper[4748]: I0320 10:49:44.696089 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-dmmdh" podUID="ae334fdf-f952-4b6b-8372-1fd7ef332362" containerName="console" containerID="cri-o://8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47" gracePeriod=15 Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.179160 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dmmdh_ae334fdf-f952-4b6b-8372-1fd7ef332362/console/0.log" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.179255 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.237215 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dmmdh_ae334fdf-f952-4b6b-8372-1fd7ef332362/console/0.log" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.237265 4748 generic.go:334] "Generic (PLEG): container finished" podID="ae334fdf-f952-4b6b-8372-1fd7ef332362" containerID="8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47" exitCode=2 Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.237298 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dmmdh" event={"ID":"ae334fdf-f952-4b6b-8372-1fd7ef332362","Type":"ContainerDied","Data":"8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47"} Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.237331 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dmmdh" 
event={"ID":"ae334fdf-f952-4b6b-8372-1fd7ef332362","Type":"ContainerDied","Data":"51f9ea693ca416aeffb6c23f1fb4c79c8b0c6bd47470b07515d782f1059f369a"} Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.237351 4748 scope.go:117] "RemoveContainer" containerID="8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.237464 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dmmdh" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.252197 4748 scope.go:117] "RemoveContainer" containerID="8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47" Mar 20 10:49:45 crc kubenswrapper[4748]: E0320 10:49:45.253243 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47\": container with ID starting with 8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47 not found: ID does not exist" containerID="8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.253298 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47"} err="failed to get container status \"8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47\": rpc error: code = NotFound desc = could not find container \"8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47\": container with ID starting with 8b271e3b6bd380bd3d266526eeff076faa8628bd20c184b5920981e864ec7a47 not found: ID does not exist" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.334296 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-config\") pod \"ae334fdf-f952-4b6b-8372-1fd7ef332362\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.334368 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-oauth-serving-cert\") pod \"ae334fdf-f952-4b6b-8372-1fd7ef332362\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.334452 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-serving-cert\") pod \"ae334fdf-f952-4b6b-8372-1fd7ef332362\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.334486 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqndt\" (UniqueName: \"kubernetes.io/projected/ae334fdf-f952-4b6b-8372-1fd7ef332362-kube-api-access-wqndt\") pod \"ae334fdf-f952-4b6b-8372-1fd7ef332362\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.334535 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-trusted-ca-bundle\") pod \"ae334fdf-f952-4b6b-8372-1fd7ef332362\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.334557 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-oauth-config\") pod \"ae334fdf-f952-4b6b-8372-1fd7ef332362\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " Mar 20 
10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.334593 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-service-ca\") pod \"ae334fdf-f952-4b6b-8372-1fd7ef332362\" (UID: \"ae334fdf-f952-4b6b-8372-1fd7ef332362\") " Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.335750 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-config" (OuterVolumeSpecName: "console-config") pod "ae334fdf-f952-4b6b-8372-1fd7ef332362" (UID: "ae334fdf-f952-4b6b-8372-1fd7ef332362"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.335768 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-service-ca" (OuterVolumeSpecName: "service-ca") pod "ae334fdf-f952-4b6b-8372-1fd7ef332362" (UID: "ae334fdf-f952-4b6b-8372-1fd7ef332362"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.336225 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ae334fdf-f952-4b6b-8372-1fd7ef332362" (UID: "ae334fdf-f952-4b6b-8372-1fd7ef332362"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.336314 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ae334fdf-f952-4b6b-8372-1fd7ef332362" (UID: "ae334fdf-f952-4b6b-8372-1fd7ef332362"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.341608 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ae334fdf-f952-4b6b-8372-1fd7ef332362" (UID: "ae334fdf-f952-4b6b-8372-1fd7ef332362"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.342036 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae334fdf-f952-4b6b-8372-1fd7ef332362-kube-api-access-wqndt" (OuterVolumeSpecName: "kube-api-access-wqndt") pod "ae334fdf-f952-4b6b-8372-1fd7ef332362" (UID: "ae334fdf-f952-4b6b-8372-1fd7ef332362"). InnerVolumeSpecName "kube-api-access-wqndt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.342418 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ae334fdf-f952-4b6b-8372-1fd7ef332362" (UID: "ae334fdf-f952-4b6b-8372-1fd7ef332362"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.441082 4748 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.441125 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqndt\" (UniqueName: \"kubernetes.io/projected/ae334fdf-f952-4b6b-8372-1fd7ef332362-kube-api-access-wqndt\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.441137 4748 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.441150 4748 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.441161 4748 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.441193 4748 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.441204 4748 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae334fdf-f952-4b6b-8372-1fd7ef332362-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:45 crc 
kubenswrapper[4748]: I0320 10:49:45.559390 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dmmdh"] Mar 20 10:49:45 crc kubenswrapper[4748]: I0320 10:49:45.564604 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-dmmdh"] Mar 20 10:49:45 crc kubenswrapper[4748]: E0320 10:49:45.626105 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae334fdf_f952_4b6b_8372_1fd7ef332362.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae334fdf_f952_4b6b_8372_1fd7ef332362.slice/crio-51f9ea693ca416aeffb6c23f1fb4c79c8b0c6bd47470b07515d782f1059f369a\": RecentStats: unable to find data in memory cache]" Mar 20 10:49:46 crc kubenswrapper[4748]: I0320 10:49:46.248627 4748 generic.go:334] "Generic (PLEG): container finished" podID="22772fc8-2959-4dfa-b1aa-070f9db955a1" containerID="c5d1259892232ae130ad4d9a9c8ab67ed9f89171f17c0762f601809f370f88e9" exitCode=0 Mar 20 10:49:46 crc kubenswrapper[4748]: I0320 10:49:46.248692 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" event={"ID":"22772fc8-2959-4dfa-b1aa-070f9db955a1","Type":"ContainerDied","Data":"c5d1259892232ae130ad4d9a9c8ab67ed9f89171f17c0762f601809f370f88e9"} Mar 20 10:49:47 crc kubenswrapper[4748]: I0320 10:49:47.260487 4748 generic.go:334] "Generic (PLEG): container finished" podID="22772fc8-2959-4dfa-b1aa-070f9db955a1" containerID="e1003e471fc78237e2765346cad0d194b4377c1b15c0521119f269a7c2cb6e0d" exitCode=0 Mar 20 10:49:47 crc kubenswrapper[4748]: I0320 10:49:47.260561 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" 
event={"ID":"22772fc8-2959-4dfa-b1aa-070f9db955a1","Type":"ContainerDied","Data":"e1003e471fc78237e2765346cad0d194b4377c1b15c0521119f269a7c2cb6e0d"} Mar 20 10:49:47 crc kubenswrapper[4748]: I0320 10:49:47.522370 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae334fdf-f952-4b6b-8372-1fd7ef332362" path="/var/lib/kubelet/pods/ae334fdf-f952-4b6b-8372-1fd7ef332362/volumes" Mar 20 10:49:48 crc kubenswrapper[4748]: I0320 10:49:48.531313 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:48 crc kubenswrapper[4748]: I0320 10:49:48.578598 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-bundle\") pod \"22772fc8-2959-4dfa-b1aa-070f9db955a1\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " Mar 20 10:49:48 crc kubenswrapper[4748]: I0320 10:49:48.578701 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlpf8\" (UniqueName: \"kubernetes.io/projected/22772fc8-2959-4dfa-b1aa-070f9db955a1-kube-api-access-dlpf8\") pod \"22772fc8-2959-4dfa-b1aa-070f9db955a1\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " Mar 20 10:49:48 crc kubenswrapper[4748]: I0320 10:49:48.578744 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-util\") pod \"22772fc8-2959-4dfa-b1aa-070f9db955a1\" (UID: \"22772fc8-2959-4dfa-b1aa-070f9db955a1\") " Mar 20 10:49:48 crc kubenswrapper[4748]: I0320 10:49:48.580734 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-bundle" (OuterVolumeSpecName: "bundle") pod "22772fc8-2959-4dfa-b1aa-070f9db955a1" (UID: 
"22772fc8-2959-4dfa-b1aa-070f9db955a1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:49:48 crc kubenswrapper[4748]: I0320 10:49:48.585485 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22772fc8-2959-4dfa-b1aa-070f9db955a1-kube-api-access-dlpf8" (OuterVolumeSpecName: "kube-api-access-dlpf8") pod "22772fc8-2959-4dfa-b1aa-070f9db955a1" (UID: "22772fc8-2959-4dfa-b1aa-070f9db955a1"). InnerVolumeSpecName "kube-api-access-dlpf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:49:48 crc kubenswrapper[4748]: I0320 10:49:48.679781 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlpf8\" (UniqueName: \"kubernetes.io/projected/22772fc8-2959-4dfa-b1aa-070f9db955a1-kube-api-access-dlpf8\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:48 crc kubenswrapper[4748]: I0320 10:49:48.679813 4748 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:49 crc kubenswrapper[4748]: I0320 10:49:49.251203 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-util" (OuterVolumeSpecName: "util") pod "22772fc8-2959-4dfa-b1aa-070f9db955a1" (UID: "22772fc8-2959-4dfa-b1aa-070f9db955a1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:49:49 crc kubenswrapper[4748]: I0320 10:49:49.274405 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" event={"ID":"22772fc8-2959-4dfa-b1aa-070f9db955a1","Type":"ContainerDied","Data":"60dd38feeab077e871d7b755cc6a429c57634551ff8810876f8ba5f6d0da6b21"} Mar 20 10:49:49 crc kubenswrapper[4748]: I0320 10:49:49.274457 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk" Mar 20 10:49:49 crc kubenswrapper[4748]: I0320 10:49:49.274475 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60dd38feeab077e871d7b755cc6a429c57634551ff8810876f8ba5f6d0da6b21" Mar 20 10:49:49 crc kubenswrapper[4748]: I0320 10:49:49.286801 4748 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22772fc8-2959-4dfa-b1aa-070f9db955a1-util\") on node \"crc\" DevicePath \"\"" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.876558 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn"] Mar 20 10:49:59 crc kubenswrapper[4748]: E0320 10:49:59.877443 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae334fdf-f952-4b6b-8372-1fd7ef332362" containerName="console" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.877458 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae334fdf-f952-4b6b-8372-1fd7ef332362" containerName="console" Mar 20 10:49:59 crc kubenswrapper[4748]: E0320 10:49:59.877473 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22772fc8-2959-4dfa-b1aa-070f9db955a1" containerName="util" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.877480 4748 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="22772fc8-2959-4dfa-b1aa-070f9db955a1" containerName="util" Mar 20 10:49:59 crc kubenswrapper[4748]: E0320 10:49:59.877505 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22772fc8-2959-4dfa-b1aa-070f9db955a1" containerName="extract" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.877513 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="22772fc8-2959-4dfa-b1aa-070f9db955a1" containerName="extract" Mar 20 10:49:59 crc kubenswrapper[4748]: E0320 10:49:59.877530 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22772fc8-2959-4dfa-b1aa-070f9db955a1" containerName="pull" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.877540 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="22772fc8-2959-4dfa-b1aa-070f9db955a1" containerName="pull" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.877657 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae334fdf-f952-4b6b-8372-1fd7ef332362" containerName="console" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.877679 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="22772fc8-2959-4dfa-b1aa-070f9db955a1" containerName="extract" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.878209 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.891876 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.892445 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.892559 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.896605 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.896675 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-kzttr" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.905930 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn"] Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.963423 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7b6c6eee-f00c-458f-b050-6aaab992addf-webhook-cert\") pod \"metallb-operator-controller-manager-5b86d95c7b-kqtkn\" (UID: \"7b6c6eee-f00c-458f-b050-6aaab992addf\") " pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.963467 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwsrb\" (UniqueName: \"kubernetes.io/projected/7b6c6eee-f00c-458f-b050-6aaab992addf-kube-api-access-qwsrb\") pod 
\"metallb-operator-controller-manager-5b86d95c7b-kqtkn\" (UID: \"7b6c6eee-f00c-458f-b050-6aaab992addf\") " pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:49:59 crc kubenswrapper[4748]: I0320 10:49:59.963512 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7b6c6eee-f00c-458f-b050-6aaab992addf-apiservice-cert\") pod \"metallb-operator-controller-manager-5b86d95c7b-kqtkn\" (UID: \"7b6c6eee-f00c-458f-b050-6aaab992addf\") " pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.064354 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7b6c6eee-f00c-458f-b050-6aaab992addf-webhook-cert\") pod \"metallb-operator-controller-manager-5b86d95c7b-kqtkn\" (UID: \"7b6c6eee-f00c-458f-b050-6aaab992addf\") " pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.064410 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwsrb\" (UniqueName: \"kubernetes.io/projected/7b6c6eee-f00c-458f-b050-6aaab992addf-kube-api-access-qwsrb\") pod \"metallb-operator-controller-manager-5b86d95c7b-kqtkn\" (UID: \"7b6c6eee-f00c-458f-b050-6aaab992addf\") " pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.064468 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7b6c6eee-f00c-458f-b050-6aaab992addf-apiservice-cert\") pod \"metallb-operator-controller-manager-5b86d95c7b-kqtkn\" (UID: \"7b6c6eee-f00c-458f-b050-6aaab992addf\") " pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:50:00 crc 
kubenswrapper[4748]: I0320 10:50:00.076659 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7b6c6eee-f00c-458f-b050-6aaab992addf-apiservice-cert\") pod \"metallb-operator-controller-manager-5b86d95c7b-kqtkn\" (UID: \"7b6c6eee-f00c-458f-b050-6aaab992addf\") " pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.076659 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7b6c6eee-f00c-458f-b050-6aaab992addf-webhook-cert\") pod \"metallb-operator-controller-manager-5b86d95c7b-kqtkn\" (UID: \"7b6c6eee-f00c-458f-b050-6aaab992addf\") " pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.085151 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwsrb\" (UniqueName: \"kubernetes.io/projected/7b6c6eee-f00c-458f-b050-6aaab992addf-kube-api-access-qwsrb\") pod \"metallb-operator-controller-manager-5b86d95c7b-kqtkn\" (UID: \"7b6c6eee-f00c-458f-b050-6aaab992addf\") " pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.108124 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m"] Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.108809 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.114212 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.116617 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.116829 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xtpch" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.134771 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m"] Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.144279 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566730-gq92r"] Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.145412 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566730-gq92r" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.152355 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.152567 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.152666 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.157169 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566730-gq92r"] Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.166775 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bghdv\" (UniqueName: \"kubernetes.io/projected/a3d53081-2293-4131-bbfb-99306c7b2c0a-kube-api-access-bghdv\") pod \"auto-csr-approver-29566730-gq92r\" (UID: \"a3d53081-2293-4131-bbfb-99306c7b2c0a\") " pod="openshift-infra/auto-csr-approver-29566730-gq92r" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.166929 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w8d6\" (UniqueName: \"kubernetes.io/projected/be8702a2-29a5-4037-94f4-0f3a4b48754d-kube-api-access-2w8d6\") pod \"metallb-operator-webhook-server-c7659c7ff-w4c8m\" (UID: \"be8702a2-29a5-4037-94f4-0f3a4b48754d\") " pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.166963 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be8702a2-29a5-4037-94f4-0f3a4b48754d-apiservice-cert\") pod 
\"metallb-operator-webhook-server-c7659c7ff-w4c8m\" (UID: \"be8702a2-29a5-4037-94f4-0f3a4b48754d\") " pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.167004 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be8702a2-29a5-4037-94f4-0f3a4b48754d-webhook-cert\") pod \"metallb-operator-webhook-server-c7659c7ff-w4c8m\" (UID: \"be8702a2-29a5-4037-94f4-0f3a4b48754d\") " pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.195170 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.269452 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bghdv\" (UniqueName: \"kubernetes.io/projected/a3d53081-2293-4131-bbfb-99306c7b2c0a-kube-api-access-bghdv\") pod \"auto-csr-approver-29566730-gq92r\" (UID: \"a3d53081-2293-4131-bbfb-99306c7b2c0a\") " pod="openshift-infra/auto-csr-approver-29566730-gq92r" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.269543 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w8d6\" (UniqueName: \"kubernetes.io/projected/be8702a2-29a5-4037-94f4-0f3a4b48754d-kube-api-access-2w8d6\") pod \"metallb-operator-webhook-server-c7659c7ff-w4c8m\" (UID: \"be8702a2-29a5-4037-94f4-0f3a4b48754d\") " pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.269574 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be8702a2-29a5-4037-94f4-0f3a4b48754d-apiservice-cert\") pod 
\"metallb-operator-webhook-server-c7659c7ff-w4c8m\" (UID: \"be8702a2-29a5-4037-94f4-0f3a4b48754d\") " pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.269618 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be8702a2-29a5-4037-94f4-0f3a4b48754d-webhook-cert\") pod \"metallb-operator-webhook-server-c7659c7ff-w4c8m\" (UID: \"be8702a2-29a5-4037-94f4-0f3a4b48754d\") " pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.274618 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be8702a2-29a5-4037-94f4-0f3a4b48754d-apiservice-cert\") pod \"metallb-operator-webhook-server-c7659c7ff-w4c8m\" (UID: \"be8702a2-29a5-4037-94f4-0f3a4b48754d\") " pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.277468 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be8702a2-29a5-4037-94f4-0f3a4b48754d-webhook-cert\") pod \"metallb-operator-webhook-server-c7659c7ff-w4c8m\" (UID: \"be8702a2-29a5-4037-94f4-0f3a4b48754d\") " pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.296715 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w8d6\" (UniqueName: \"kubernetes.io/projected/be8702a2-29a5-4037-94f4-0f3a4b48754d-kube-api-access-2w8d6\") pod \"metallb-operator-webhook-server-c7659c7ff-w4c8m\" (UID: \"be8702a2-29a5-4037-94f4-0f3a4b48754d\") " pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.297423 4748 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-bghdv\" (UniqueName: \"kubernetes.io/projected/a3d53081-2293-4131-bbfb-99306c7b2c0a-kube-api-access-bghdv\") pod \"auto-csr-approver-29566730-gq92r\" (UID: \"a3d53081-2293-4131-bbfb-99306c7b2c0a\") " pod="openshift-infra/auto-csr-approver-29566730-gq92r" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.434730 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.470878 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566730-gq92r" Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.551418 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn"] Mar 20 10:50:00 crc kubenswrapper[4748]: W0320 10:50:00.602427 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b6c6eee_f00c_458f_b050_6aaab992addf.slice/crio-3c8960c2d8f087f5a7f43b34879f1bf563bd78537cc95910e6ba44286f5b3f49 WatchSource:0}: Error finding container 3c8960c2d8f087f5a7f43b34879f1bf563bd78537cc95910e6ba44286f5b3f49: Status 404 returned error can't find the container with id 3c8960c2d8f087f5a7f43b34879f1bf563bd78537cc95910e6ba44286f5b3f49 Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.629209 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" event={"ID":"7b6c6eee-f00c-458f-b050-6aaab992addf","Type":"ContainerStarted","Data":"3c8960c2d8f087f5a7f43b34879f1bf563bd78537cc95910e6ba44286f5b3f49"} Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.782853 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566730-gq92r"] Mar 20 10:50:00 crc kubenswrapper[4748]: I0320 10:50:00.946930 
4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m"] Mar 20 10:50:00 crc kubenswrapper[4748]: W0320 10:50:00.949121 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe8702a2_29a5_4037_94f4_0f3a4b48754d.slice/crio-481fd0c9943137d79a1a7d20b85b9e3ca05e4cf0a89b732f0501d77b6714bedc WatchSource:0}: Error finding container 481fd0c9943137d79a1a7d20b85b9e3ca05e4cf0a89b732f0501d77b6714bedc: Status 404 returned error can't find the container with id 481fd0c9943137d79a1a7d20b85b9e3ca05e4cf0a89b732f0501d77b6714bedc Mar 20 10:50:01 crc kubenswrapper[4748]: I0320 10:50:01.638022 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" event={"ID":"be8702a2-29a5-4037-94f4-0f3a4b48754d","Type":"ContainerStarted","Data":"481fd0c9943137d79a1a7d20b85b9e3ca05e4cf0a89b732f0501d77b6714bedc"} Mar 20 10:50:01 crc kubenswrapper[4748]: I0320 10:50:01.639941 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566730-gq92r" event={"ID":"a3d53081-2293-4131-bbfb-99306c7b2c0a","Type":"ContainerStarted","Data":"a6139c3229bdd4f3b4f47dca572509ad23a42274c285a88ffdd65c8bbc0c976a"} Mar 20 10:50:04 crc kubenswrapper[4748]: I0320 10:50:04.691410 4748 generic.go:334] "Generic (PLEG): container finished" podID="a3d53081-2293-4131-bbfb-99306c7b2c0a" containerID="f7863050dbe48cee1b8d6c88fffaf895e820d9a23f3721578114ae080eaaeb81" exitCode=0 Mar 20 10:50:04 crc kubenswrapper[4748]: I0320 10:50:04.691594 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566730-gq92r" event={"ID":"a3d53081-2293-4131-bbfb-99306c7b2c0a","Type":"ContainerDied","Data":"f7863050dbe48cee1b8d6c88fffaf895e820d9a23f3721578114ae080eaaeb81"} Mar 20 10:50:06 crc kubenswrapper[4748]: I0320 10:50:06.470036 4748 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566730-gq92r" Mar 20 10:50:06 crc kubenswrapper[4748]: I0320 10:50:06.622714 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bghdv\" (UniqueName: \"kubernetes.io/projected/a3d53081-2293-4131-bbfb-99306c7b2c0a-kube-api-access-bghdv\") pod \"a3d53081-2293-4131-bbfb-99306c7b2c0a\" (UID: \"a3d53081-2293-4131-bbfb-99306c7b2c0a\") " Mar 20 10:50:06 crc kubenswrapper[4748]: I0320 10:50:06.629826 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d53081-2293-4131-bbfb-99306c7b2c0a-kube-api-access-bghdv" (OuterVolumeSpecName: "kube-api-access-bghdv") pod "a3d53081-2293-4131-bbfb-99306c7b2c0a" (UID: "a3d53081-2293-4131-bbfb-99306c7b2c0a"). InnerVolumeSpecName "kube-api-access-bghdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:50:06 crc kubenswrapper[4748]: I0320 10:50:06.705272 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566730-gq92r" event={"ID":"a3d53081-2293-4131-bbfb-99306c7b2c0a","Type":"ContainerDied","Data":"a6139c3229bdd4f3b4f47dca572509ad23a42274c285a88ffdd65c8bbc0c976a"} Mar 20 10:50:06 crc kubenswrapper[4748]: I0320 10:50:06.705315 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6139c3229bdd4f3b4f47dca572509ad23a42274c285a88ffdd65c8bbc0c976a" Mar 20 10:50:06 crc kubenswrapper[4748]: I0320 10:50:06.705366 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566730-gq92r" Mar 20 10:50:06 crc kubenswrapper[4748]: I0320 10:50:06.724573 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bghdv\" (UniqueName: \"kubernetes.io/projected/a3d53081-2293-4131-bbfb-99306c7b2c0a-kube-api-access-bghdv\") on node \"crc\" DevicePath \"\"" Mar 20 10:50:07 crc kubenswrapper[4748]: I0320 10:50:07.525880 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566724-dh74f"] Mar 20 10:50:07 crc kubenswrapper[4748]: I0320 10:50:07.527878 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566724-dh74f"] Mar 20 10:50:09 crc kubenswrapper[4748]: I0320 10:50:09.534687 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d228a510-6e8c-4489-9d87-5d0cdec16828" path="/var/lib/kubelet/pods/d228a510-6e8c-4489-9d87-5d0cdec16828/volumes" Mar 20 10:50:11 crc kubenswrapper[4748]: I0320 10:50:11.157387 4748 scope.go:117] "RemoveContainer" containerID="e180064a952221cbb687ca77ba1ef3e2b706db532ef183b13386b8491ef01057" Mar 20 10:50:15 crc kubenswrapper[4748]: I0320 10:50:15.777034 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" event={"ID":"be8702a2-29a5-4037-94f4-0f3a4b48754d","Type":"ContainerStarted","Data":"4b22a7d0b7fb8a4a849ce4aaa5f855bb5af92f3b922ff5f3e7b796a65f83189a"} Mar 20 10:50:15 crc kubenswrapper[4748]: I0320 10:50:15.777882 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:15 crc kubenswrapper[4748]: I0320 10:50:15.779481 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" 
event={"ID":"7b6c6eee-f00c-458f-b050-6aaab992addf","Type":"ContainerStarted","Data":"d3eb06410843b9e2ce84384f448baf672d85ddd5bd16af2c9529f6ba6bfe8cf4"} Mar 20 10:50:15 crc kubenswrapper[4748]: I0320 10:50:15.779677 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:50:15 crc kubenswrapper[4748]: I0320 10:50:15.804906 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" podStartSLOduration=2.147913857 podStartE2EDuration="15.804879144s" podCreationTimestamp="2026-03-20 10:50:00 +0000 UTC" firstStartedPulling="2026-03-20 10:50:00.952224209 +0000 UTC m=+836.093770023" lastFinishedPulling="2026-03-20 10:50:14.609189486 +0000 UTC m=+849.750735310" observedRunningTime="2026-03-20 10:50:15.801300534 +0000 UTC m=+850.942846388" watchObservedRunningTime="2026-03-20 10:50:15.804879144 +0000 UTC m=+850.946424998" Mar 20 10:50:15 crc kubenswrapper[4748]: I0320 10:50:15.832263 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" podStartSLOduration=2.838846748 podStartE2EDuration="16.832215707s" podCreationTimestamp="2026-03-20 10:49:59 +0000 UTC" firstStartedPulling="2026-03-20 10:50:00.606648118 +0000 UTC m=+835.748193932" lastFinishedPulling="2026-03-20 10:50:14.600017067 +0000 UTC m=+849.741562891" observedRunningTime="2026-03-20 10:50:15.828358101 +0000 UTC m=+850.969903935" watchObservedRunningTime="2026-03-20 10:50:15.832215707 +0000 UTC m=+850.973761531" Mar 20 10:50:30 crc kubenswrapper[4748]: I0320 10:50:30.439723 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-c7659c7ff-w4c8m" Mar 20 10:50:42 crc kubenswrapper[4748]: I0320 10:50:42.928719 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:50:42 crc kubenswrapper[4748]: I0320 10:50:42.929285 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.199882 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5b86d95c7b-kqtkn" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.880418 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj"] Mar 20 10:50:50 crc kubenswrapper[4748]: E0320 10:50:50.881131 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d53081-2293-4131-bbfb-99306c7b2c0a" containerName="oc" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.881149 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d53081-2293-4131-bbfb-99306c7b2c0a" containerName="oc" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.881319 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d53081-2293-4131-bbfb-99306c7b2c0a" containerName="oc" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.881869 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.883588 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.884167 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-2kf2j" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.884481 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-wf8gw"] Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.886740 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.889904 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.891446 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.894874 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj"] Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.950427 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-frr-startup\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.950483 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7qqf\" (UniqueName: \"kubernetes.io/projected/de736bb5-e7a6-4a9a-8841-5ff65871db92-kube-api-access-c7qqf\") pod \"frr-k8s-webhook-server-bcc4b6f68-8rblj\" (UID: 
\"de736bb5-e7a6-4a9a-8841-5ff65871db92\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.950612 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-frr-sockets\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.950640 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-metrics-certs\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.950794 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de736bb5-e7a6-4a9a-8841-5ff65871db92-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8rblj\" (UID: \"de736bb5-e7a6-4a9a-8841-5ff65871db92\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.950880 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-metrics\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.950911 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnmps\" (UniqueName: \"kubernetes.io/projected/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-kube-api-access-cnmps\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " 
pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.950976 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-reloader\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.950994 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-frr-conf\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.970848 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vkqf7"] Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.971686 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-vkqf7" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.974118 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.974473 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-t5548" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.974759 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.976476 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.985738 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-ppl7d"] Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.986694 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:50 crc kubenswrapper[4748]: I0320 10:50:50.990862 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.015248 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-ppl7d"] Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.052685 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-memberlist\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.052738 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8df7\" (UniqueName: \"kubernetes.io/projected/7a239ede-0107-4751-b103-27b225f2cf5e-kube-api-access-r8df7\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.052765 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dcaf132-7ad3-4b86-ba2f-e695238b2001-metrics-certs\") pod \"controller-7bb4cc7c98-ppl7d\" (UID: \"1dcaf132-7ad3-4b86-ba2f-e695238b2001\") " pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.052878 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-reloader\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 
10:50:51.052928 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-frr-conf\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.052995 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ggkg\" (UniqueName: \"kubernetes.io/projected/1dcaf132-7ad3-4b86-ba2f-e695238b2001-kube-api-access-9ggkg\") pod \"controller-7bb4cc7c98-ppl7d\" (UID: \"1dcaf132-7ad3-4b86-ba2f-e695238b2001\") " pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053020 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1dcaf132-7ad3-4b86-ba2f-e695238b2001-cert\") pod \"controller-7bb4cc7c98-ppl7d\" (UID: \"1dcaf132-7ad3-4b86-ba2f-e695238b2001\") " pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053054 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7a239ede-0107-4751-b103-27b225f2cf5e-metallb-excludel2\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053080 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-frr-startup\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053104 4748 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-c7qqf\" (UniqueName: \"kubernetes.io/projected/de736bb5-e7a6-4a9a-8841-5ff65871db92-kube-api-access-c7qqf\") pod \"frr-k8s-webhook-server-bcc4b6f68-8rblj\" (UID: \"de736bb5-e7a6-4a9a-8841-5ff65871db92\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053192 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-frr-sockets\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053212 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-metrics-certs\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053263 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-metrics-certs\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053307 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de736bb5-e7a6-4a9a-8841-5ff65871db92-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8rblj\" (UID: \"de736bb5-e7a6-4a9a-8841-5ff65871db92\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053338 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-metrics\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053356 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnmps\" (UniqueName: \"kubernetes.io/projected/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-kube-api-access-cnmps\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053393 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-frr-conf\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.053536 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-reloader\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: E0320 10:50:51.053647 4748 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 20 10:50:51 crc kubenswrapper[4748]: E0320 10:50:51.053692 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de736bb5-e7a6-4a9a-8841-5ff65871db92-cert podName:de736bb5-e7a6-4a9a-8841-5ff65871db92 nodeName:}" failed. No retries permitted until 2026-03-20 10:50:51.55367688 +0000 UTC m=+886.695222684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de736bb5-e7a6-4a9a-8841-5ff65871db92-cert") pod "frr-k8s-webhook-server-bcc4b6f68-8rblj" (UID: "de736bb5-e7a6-4a9a-8841-5ff65871db92") : secret "frr-k8s-webhook-server-cert" not found Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.054054 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-metrics\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.054437 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-frr-sockets\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.054716 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-frr-startup\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.079736 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-metrics-certs\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.095486 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnmps\" (UniqueName: \"kubernetes.io/projected/9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c-kube-api-access-cnmps\") pod \"frr-k8s-wf8gw\" (UID: \"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c\") " 
pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.103419 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7qqf\" (UniqueName: \"kubernetes.io/projected/de736bb5-e7a6-4a9a-8841-5ff65871db92-kube-api-access-c7qqf\") pod \"frr-k8s-webhook-server-bcc4b6f68-8rblj\" (UID: \"de736bb5-e7a6-4a9a-8841-5ff65871db92\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.154364 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ggkg\" (UniqueName: \"kubernetes.io/projected/1dcaf132-7ad3-4b86-ba2f-e695238b2001-kube-api-access-9ggkg\") pod \"controller-7bb4cc7c98-ppl7d\" (UID: \"1dcaf132-7ad3-4b86-ba2f-e695238b2001\") " pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.154414 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1dcaf132-7ad3-4b86-ba2f-e695238b2001-cert\") pod \"controller-7bb4cc7c98-ppl7d\" (UID: \"1dcaf132-7ad3-4b86-ba2f-e695238b2001\") " pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.154450 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7a239ede-0107-4751-b103-27b225f2cf5e-metallb-excludel2\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.154509 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-metrics-certs\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: 
I0320 10:50:51.154602 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-memberlist\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.154629 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8df7\" (UniqueName: \"kubernetes.io/projected/7a239ede-0107-4751-b103-27b225f2cf5e-kube-api-access-r8df7\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.154651 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dcaf132-7ad3-4b86-ba2f-e695238b2001-metrics-certs\") pod \"controller-7bb4cc7c98-ppl7d\" (UID: \"1dcaf132-7ad3-4b86-ba2f-e695238b2001\") " pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:51 crc kubenswrapper[4748]: E0320 10:50:51.154773 4748 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 20 10:50:51 crc kubenswrapper[4748]: E0320 10:50:51.154875 4748 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 20 10:50:51 crc kubenswrapper[4748]: E0320 10:50:51.154817 4748 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 10:50:51 crc kubenswrapper[4748]: E0320 10:50:51.154913 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-metrics-certs podName:7a239ede-0107-4751-b103-27b225f2cf5e nodeName:}" failed. No retries permitted until 2026-03-20 10:50:51.654885169 +0000 UTC m=+886.796430983 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-metrics-certs") pod "speaker-vkqf7" (UID: "7a239ede-0107-4751-b103-27b225f2cf5e") : secret "speaker-certs-secret" not found Mar 20 10:50:51 crc kubenswrapper[4748]: E0320 10:50:51.154936 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-memberlist podName:7a239ede-0107-4751-b103-27b225f2cf5e nodeName:}" failed. No retries permitted until 2026-03-20 10:50:51.65492321 +0000 UTC m=+886.796469024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-memberlist") pod "speaker-vkqf7" (UID: "7a239ede-0107-4751-b103-27b225f2cf5e") : secret "metallb-memberlist" not found Mar 20 10:50:51 crc kubenswrapper[4748]: E0320 10:50:51.154954 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dcaf132-7ad3-4b86-ba2f-e695238b2001-metrics-certs podName:1dcaf132-7ad3-4b86-ba2f-e695238b2001 nodeName:}" failed. No retries permitted until 2026-03-20 10:50:51.65494579 +0000 UTC m=+886.796491604 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dcaf132-7ad3-4b86-ba2f-e695238b2001-metrics-certs") pod "controller-7bb4cc7c98-ppl7d" (UID: "1dcaf132-7ad3-4b86-ba2f-e695238b2001") : secret "controller-certs-secret" not found Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.155374 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7a239ede-0107-4751-b103-27b225f2cf5e-metallb-excludel2\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.156705 4748 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.169322 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1dcaf132-7ad3-4b86-ba2f-e695238b2001-cert\") pod \"controller-7bb4cc7c98-ppl7d\" (UID: \"1dcaf132-7ad3-4b86-ba2f-e695238b2001\") " pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.172142 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ggkg\" (UniqueName: \"kubernetes.io/projected/1dcaf132-7ad3-4b86-ba2f-e695238b2001-kube-api-access-9ggkg\") pod \"controller-7bb4cc7c98-ppl7d\" (UID: \"1dcaf132-7ad3-4b86-ba2f-e695238b2001\") " pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.188375 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8df7\" (UniqueName: \"kubernetes.io/projected/7a239ede-0107-4751-b103-27b225f2cf5e-kube-api-access-r8df7\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.251220 
4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.559807 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de736bb5-e7a6-4a9a-8841-5ff65871db92-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8rblj\" (UID: \"de736bb5-e7a6-4a9a-8841-5ff65871db92\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.565028 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de736bb5-e7a6-4a9a-8841-5ff65871db92-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8rblj\" (UID: \"de736bb5-e7a6-4a9a-8841-5ff65871db92\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.660905 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-memberlist\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.660985 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dcaf132-7ad3-4b86-ba2f-e695238b2001-metrics-certs\") pod \"controller-7bb4cc7c98-ppl7d\" (UID: \"1dcaf132-7ad3-4b86-ba2f-e695238b2001\") " pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.661107 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-metrics-certs\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 
crc kubenswrapper[4748]: E0320 10:50:51.661139 4748 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 10:50:51 crc kubenswrapper[4748]: E0320 10:50:51.661223 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-memberlist podName:7a239ede-0107-4751-b103-27b225f2cf5e nodeName:}" failed. No retries permitted until 2026-03-20 10:50:52.661198131 +0000 UTC m=+887.802743955 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-memberlist") pod "speaker-vkqf7" (UID: "7a239ede-0107-4751-b103-27b225f2cf5e") : secret "metallb-memberlist" not found Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.664248 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dcaf132-7ad3-4b86-ba2f-e695238b2001-metrics-certs\") pod \"controller-7bb4cc7c98-ppl7d\" (UID: \"1dcaf132-7ad3-4b86-ba2f-e695238b2001\") " pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.664501 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-metrics-certs\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.799224 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:50:51 crc kubenswrapper[4748]: I0320 10:50:51.905968 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:52 crc kubenswrapper[4748]: I0320 10:50:52.111609 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-ppl7d"] Mar 20 10:50:52 crc kubenswrapper[4748]: W0320 10:50:52.113449 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dcaf132_7ad3_4b86_ba2f_e695238b2001.slice/crio-77970dd5af76bcfbd57c78f8e09a00e7211c14343cd57f00bdd43807ccaaf77d WatchSource:0}: Error finding container 77970dd5af76bcfbd57c78f8e09a00e7211c14343cd57f00bdd43807ccaaf77d: Status 404 returned error can't find the container with id 77970dd5af76bcfbd57c78f8e09a00e7211c14343cd57f00bdd43807ccaaf77d Mar 20 10:50:52 crc kubenswrapper[4748]: I0320 10:50:52.202094 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj"] Mar 20 10:50:52 crc kubenswrapper[4748]: I0320 10:50:52.213479 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wf8gw" event={"ID":"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c","Type":"ContainerStarted","Data":"df479fa14a2ef0b0ba7fe7c4215de7230b68406ce49dcb477e9fb0a956d3fae4"} Mar 20 10:50:52 crc kubenswrapper[4748]: I0320 10:50:52.215101 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" event={"ID":"de736bb5-e7a6-4a9a-8841-5ff65871db92","Type":"ContainerStarted","Data":"38e346d28a95d71066df61c0872e9ebeffbb76aafce657ba0a6f141a0a8dc3e6"} Mar 20 10:50:52 crc kubenswrapper[4748]: I0320 10:50:52.216489 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-ppl7d" event={"ID":"1dcaf132-7ad3-4b86-ba2f-e695238b2001","Type":"ContainerStarted","Data":"77970dd5af76bcfbd57c78f8e09a00e7211c14343cd57f00bdd43807ccaaf77d"} Mar 20 10:50:52 crc kubenswrapper[4748]: I0320 10:50:52.681289 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-memberlist\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:52 crc kubenswrapper[4748]: I0320 10:50:52.688285 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7a239ede-0107-4751-b103-27b225f2cf5e-memberlist\") pod \"speaker-vkqf7\" (UID: \"7a239ede-0107-4751-b103-27b225f2cf5e\") " pod="metallb-system/speaker-vkqf7" Mar 20 10:50:52 crc kubenswrapper[4748]: I0320 10:50:52.787554 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vkqf7" Mar 20 10:50:52 crc kubenswrapper[4748]: W0320 10:50:52.816755 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a239ede_0107_4751_b103_27b225f2cf5e.slice/crio-e6be7342490ae727be670d97316a3285e61129d8d9382cacdd1fce11ad97a259 WatchSource:0}: Error finding container e6be7342490ae727be670d97316a3285e61129d8d9382cacdd1fce11ad97a259: Status 404 returned error can't find the container with id e6be7342490ae727be670d97316a3285e61129d8d9382cacdd1fce11ad97a259 Mar 20 10:50:53 crc kubenswrapper[4748]: I0320 10:50:53.235667 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vkqf7" event={"ID":"7a239ede-0107-4751-b103-27b225f2cf5e","Type":"ContainerStarted","Data":"d10a7904eeeacbfea39b5ce8790f134a7897ea921b553c03f368fe06d03fe129"} Mar 20 10:50:53 crc kubenswrapper[4748]: I0320 10:50:53.236027 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vkqf7" event={"ID":"7a239ede-0107-4751-b103-27b225f2cf5e","Type":"ContainerStarted","Data":"e6be7342490ae727be670d97316a3285e61129d8d9382cacdd1fce11ad97a259"} Mar 20 10:50:53 crc kubenswrapper[4748]: I0320 
10:50:53.240074 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-ppl7d" event={"ID":"1dcaf132-7ad3-4b86-ba2f-e695238b2001","Type":"ContainerStarted","Data":"a79c79375f76d896b4d78be093716601d6441910acc02666c9aeeebf0326943a"} Mar 20 10:50:53 crc kubenswrapper[4748]: I0320 10:50:53.240135 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-ppl7d" event={"ID":"1dcaf132-7ad3-4b86-ba2f-e695238b2001","Type":"ContainerStarted","Data":"b927ba65c44c4647414681d61b3460d21f95d11e9c26f9b75cede9e57c80982f"} Mar 20 10:50:53 crc kubenswrapper[4748]: I0320 10:50:53.241198 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:50:53 crc kubenswrapper[4748]: I0320 10:50:53.268234 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-ppl7d" podStartSLOduration=3.268204595 podStartE2EDuration="3.268204595s" podCreationTimestamp="2026-03-20 10:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:50:53.261905878 +0000 UTC m=+888.403451722" watchObservedRunningTime="2026-03-20 10:50:53.268204595 +0000 UTC m=+888.409750409" Mar 20 10:50:54 crc kubenswrapper[4748]: I0320 10:50:54.252007 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vkqf7" event={"ID":"7a239ede-0107-4751-b103-27b225f2cf5e","Type":"ContainerStarted","Data":"4af1427d6c9b60cac6ea74ef1bd7d76f4dd3c67038ca59a68631d2a204b3f16a"} Mar 20 10:50:54 crc kubenswrapper[4748]: I0320 10:50:54.273789 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vkqf7" podStartSLOduration=4.273770072 podStartE2EDuration="4.273770072s" podCreationTimestamp="2026-03-20 10:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:50:54.27128879 +0000 UTC m=+889.412834604" watchObservedRunningTime="2026-03-20 10:50:54.273770072 +0000 UTC m=+889.415315886" Mar 20 10:50:55 crc kubenswrapper[4748]: I0320 10:50:55.257934 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vkqf7" Mar 20 10:50:59 crc kubenswrapper[4748]: I0320 10:50:59.289303 4748 generic.go:334] "Generic (PLEG): container finished" podID="9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c" containerID="d1399488595d380735b6232b242093e351b49154023a9138025bb689917915b5" exitCode=0 Mar 20 10:50:59 crc kubenswrapper[4748]: I0320 10:50:59.289437 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wf8gw" event={"ID":"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c","Type":"ContainerDied","Data":"d1399488595d380735b6232b242093e351b49154023a9138025bb689917915b5"} Mar 20 10:50:59 crc kubenswrapper[4748]: I0320 10:50:59.291874 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" event={"ID":"de736bb5-e7a6-4a9a-8841-5ff65871db92","Type":"ContainerStarted","Data":"2ff179a09508985b9b6475521102ab61c18b06051820ffed561d2ad8b8e17c8a"} Mar 20 10:50:59 crc kubenswrapper[4748]: I0320 10:50:59.292041 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:51:00 crc kubenswrapper[4748]: I0320 10:51:00.329407 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" podStartSLOduration=3.859719325 podStartE2EDuration="10.329386564s" podCreationTimestamp="2026-03-20 10:50:50 +0000 UTC" firstStartedPulling="2026-03-20 10:50:52.208675141 +0000 UTC m=+887.350220955" lastFinishedPulling="2026-03-20 10:50:58.67834238 +0000 UTC m=+893.819888194" observedRunningTime="2026-03-20 10:50:59.310318001 +0000 UTC 
m=+894.451863815" watchObservedRunningTime="2026-03-20 10:51:00.329386564 +0000 UTC m=+895.470932378" Mar 20 10:51:01 crc kubenswrapper[4748]: I0320 10:51:01.307495 4748 generic.go:334] "Generic (PLEG): container finished" podID="9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c" containerID="2013a168460c2f1151cee77e467bc2bf1cfc8f4f3e8d430fa4cc26efe94d7ed1" exitCode=0 Mar 20 10:51:01 crc kubenswrapper[4748]: I0320 10:51:01.307989 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wf8gw" event={"ID":"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c","Type":"ContainerDied","Data":"2013a168460c2f1151cee77e467bc2bf1cfc8f4f3e8d430fa4cc26efe94d7ed1"} Mar 20 10:51:02 crc kubenswrapper[4748]: I0320 10:51:02.321065 4748 generic.go:334] "Generic (PLEG): container finished" podID="9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c" containerID="5ca03db0a5c6e73656705403b4512b3da008df63154935dd72987bb0d308e8c1" exitCode=0 Mar 20 10:51:02 crc kubenswrapper[4748]: I0320 10:51:02.321153 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wf8gw" event={"ID":"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c","Type":"ContainerDied","Data":"5ca03db0a5c6e73656705403b4512b3da008df63154935dd72987bb0d308e8c1"} Mar 20 10:51:03 crc kubenswrapper[4748]: I0320 10:51:03.345574 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wf8gw" event={"ID":"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c","Type":"ContainerStarted","Data":"bf61f0b6b6fe689ec1908afd7b6aca80a71d2d664d2a85bd0a084d3af11b12ca"} Mar 20 10:51:03 crc kubenswrapper[4748]: I0320 10:51:03.345920 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wf8gw" event={"ID":"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c","Type":"ContainerStarted","Data":"9db30ad23c2912124104daae3b7a83e029ba6c918dbcda62d5b4f86d5f0e8de7"} Mar 20 10:51:03 crc kubenswrapper[4748]: I0320 10:51:03.345934 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wf8gw" 
event={"ID":"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c","Type":"ContainerStarted","Data":"a643c03d5a40f3d3751b80308c67de784ed76f62ef542446e5f98f1fe4c4780b"} Mar 20 10:51:03 crc kubenswrapper[4748]: I0320 10:51:03.345946 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wf8gw" event={"ID":"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c","Type":"ContainerStarted","Data":"628d279e54f72b08ac89828181cd97b4247b0a335200f5b41422b769a7350eed"} Mar 20 10:51:03 crc kubenswrapper[4748]: I0320 10:51:03.345959 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wf8gw" event={"ID":"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c","Type":"ContainerStarted","Data":"e0de6b88efdbd5d92690305dfc03db7b00c4f212566f7ffe2996bde62b791beb"} Mar 20 10:51:04 crc kubenswrapper[4748]: I0320 10:51:04.360546 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wf8gw" event={"ID":"9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c","Type":"ContainerStarted","Data":"1664ca9474c40882dc1713dd6e00f56eb248e5469c5d324b9af997ad1a1f0c97"} Mar 20 10:51:04 crc kubenswrapper[4748]: I0320 10:51:04.360884 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:51:04 crc kubenswrapper[4748]: I0320 10:51:04.391575 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-wf8gw" podStartSLOduration=7.078142894 podStartE2EDuration="14.391547757s" podCreationTimestamp="2026-03-20 10:50:50 +0000 UTC" firstStartedPulling="2026-03-20 10:50:51.349757408 +0000 UTC m=+886.491303222" lastFinishedPulling="2026-03-20 10:50:58.663162271 +0000 UTC m=+893.804708085" observedRunningTime="2026-03-20 10:51:04.389107636 +0000 UTC m=+899.530653490" watchObservedRunningTime="2026-03-20 10:51:04.391547757 +0000 UTC m=+899.533093581" Mar 20 10:51:06 crc kubenswrapper[4748]: I0320 10:51:06.252381 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:51:06 crc kubenswrapper[4748]: I0320 10:51:06.290141 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:51:11 crc kubenswrapper[4748]: I0320 10:51:11.808052 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8rblj" Mar 20 10:51:11 crc kubenswrapper[4748]: I0320 10:51:11.911787 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-ppl7d" Mar 20 10:51:12 crc kubenswrapper[4748]: I0320 10:51:12.793785 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vkqf7" Mar 20 10:51:12 crc kubenswrapper[4748]: I0320 10:51:12.928610 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:51:12 crc kubenswrapper[4748]: I0320 10:51:12.928735 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.323011 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m7p7r"] Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.324359 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.332453 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7p7r"] Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.475558 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-utilities\") pod \"community-operators-m7p7r\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.475613 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-catalog-content\") pod \"community-operators-m7p7r\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.475709 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rttjf\" (UniqueName: \"kubernetes.io/projected/9f5e3014-2051-4bf5-a1df-b761841fcdaa-kube-api-access-rttjf\") pod \"community-operators-m7p7r\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.577167 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rttjf\" (UniqueName: \"kubernetes.io/projected/9f5e3014-2051-4bf5-a1df-b761841fcdaa-kube-api-access-rttjf\") pod \"community-operators-m7p7r\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.577297 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-utilities\") pod \"community-operators-m7p7r\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.577329 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-catalog-content\") pod \"community-operators-m7p7r\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.577888 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-catalog-content\") pod \"community-operators-m7p7r\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.578021 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-utilities\") pod \"community-operators-m7p7r\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.599544 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rttjf\" (UniqueName: \"kubernetes.io/projected/9f5e3014-2051-4bf5-a1df-b761841fcdaa-kube-api-access-rttjf\") pod \"community-operators-m7p7r\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:14 crc kubenswrapper[4748]: I0320 10:51:14.686462 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:15 crc kubenswrapper[4748]: I0320 10:51:15.210239 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7p7r"] Mar 20 10:51:15 crc kubenswrapper[4748]: I0320 10:51:15.469895 4748 generic.go:334] "Generic (PLEG): container finished" podID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerID="f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b" exitCode=0 Mar 20 10:51:15 crc kubenswrapper[4748]: I0320 10:51:15.469990 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7p7r" event={"ID":"9f5e3014-2051-4bf5-a1df-b761841fcdaa","Type":"ContainerDied","Data":"f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b"} Mar 20 10:51:15 crc kubenswrapper[4748]: I0320 10:51:15.470223 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7p7r" event={"ID":"9f5e3014-2051-4bf5-a1df-b761841fcdaa","Type":"ContainerStarted","Data":"a768881acfa6eef2ccbd22dc4b62813f3f4ceda59c6ed5a23a4c34f18c75ebdb"} Mar 20 10:51:18 crc kubenswrapper[4748]: I0320 10:51:18.900734 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-p9wd8"] Mar 20 10:51:18 crc kubenswrapper[4748]: I0320 10:51:18.902500 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p9wd8" Mar 20 10:51:18 crc kubenswrapper[4748]: I0320 10:51:18.905405 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 10:51:18 crc kubenswrapper[4748]: I0320 10:51:18.905738 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-n47hh" Mar 20 10:51:18 crc kubenswrapper[4748]: I0320 10:51:18.905988 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 10:51:18 crc kubenswrapper[4748]: I0320 10:51:18.911924 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p9wd8"] Mar 20 10:51:19 crc kubenswrapper[4748]: I0320 10:51:19.046747 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nv6j\" (UniqueName: \"kubernetes.io/projected/246b06bc-5f0b-4ef1-87eb-a0f56ad26e30-kube-api-access-4nv6j\") pod \"openstack-operator-index-p9wd8\" (UID: \"246b06bc-5f0b-4ef1-87eb-a0f56ad26e30\") " pod="openstack-operators/openstack-operator-index-p9wd8" Mar 20 10:51:19 crc kubenswrapper[4748]: I0320 10:51:19.147629 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nv6j\" (UniqueName: \"kubernetes.io/projected/246b06bc-5f0b-4ef1-87eb-a0f56ad26e30-kube-api-access-4nv6j\") pod \"openstack-operator-index-p9wd8\" (UID: \"246b06bc-5f0b-4ef1-87eb-a0f56ad26e30\") " pod="openstack-operators/openstack-operator-index-p9wd8" Mar 20 10:51:19 crc kubenswrapper[4748]: I0320 10:51:19.168247 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nv6j\" (UniqueName: \"kubernetes.io/projected/246b06bc-5f0b-4ef1-87eb-a0f56ad26e30-kube-api-access-4nv6j\") pod \"openstack-operator-index-p9wd8\" (UID: 
\"246b06bc-5f0b-4ef1-87eb-a0f56ad26e30\") " pod="openstack-operators/openstack-operator-index-p9wd8" Mar 20 10:51:19 crc kubenswrapper[4748]: I0320 10:51:19.224373 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-p9wd8" Mar 20 10:51:19 crc kubenswrapper[4748]: I0320 10:51:19.498919 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7p7r" event={"ID":"9f5e3014-2051-4bf5-a1df-b761841fcdaa","Type":"ContainerStarted","Data":"02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2"} Mar 20 10:51:19 crc kubenswrapper[4748]: I0320 10:51:19.642526 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p9wd8"] Mar 20 10:51:19 crc kubenswrapper[4748]: W0320 10:51:19.648030 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod246b06bc_5f0b_4ef1_87eb_a0f56ad26e30.slice/crio-7af89c20596e76f5235aa429ac44073a3411dd9c3c8b4cad55436bf461902e91 WatchSource:0}: Error finding container 7af89c20596e76f5235aa429ac44073a3411dd9c3c8b4cad55436bf461902e91: Status 404 returned error can't find the container with id 7af89c20596e76f5235aa429ac44073a3411dd9c3c8b4cad55436bf461902e91 Mar 20 10:51:20 crc kubenswrapper[4748]: I0320 10:51:20.507999 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p9wd8" event={"ID":"246b06bc-5f0b-4ef1-87eb-a0f56ad26e30","Type":"ContainerStarted","Data":"7af89c20596e76f5235aa429ac44073a3411dd9c3c8b4cad55436bf461902e91"} Mar 20 10:51:20 crc kubenswrapper[4748]: I0320 10:51:20.509514 4748 generic.go:334] "Generic (PLEG): container finished" podID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerID="02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2" exitCode=0 Mar 20 10:51:20 crc kubenswrapper[4748]: I0320 10:51:20.509553 4748 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7p7r" event={"ID":"9f5e3014-2051-4bf5-a1df-b761841fcdaa","Type":"ContainerDied","Data":"02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2"} Mar 20 10:51:21 crc kubenswrapper[4748]: I0320 10:51:21.255461 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-wf8gw" Mar 20 10:51:22 crc kubenswrapper[4748]: I0320 10:51:22.521974 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p9wd8" event={"ID":"246b06bc-5f0b-4ef1-87eb-a0f56ad26e30","Type":"ContainerStarted","Data":"9b196e1aeb64530ff77bd6f91684df744a7082b281be25d0e0b6e9d27f18ae83"} Mar 20 10:51:22 crc kubenswrapper[4748]: I0320 10:51:22.525501 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7p7r" event={"ID":"9f5e3014-2051-4bf5-a1df-b761841fcdaa","Type":"ContainerStarted","Data":"c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705"} Mar 20 10:51:22 crc kubenswrapper[4748]: I0320 10:51:22.540236 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-p9wd8" podStartSLOduration=2.126628256 podStartE2EDuration="4.540215125s" podCreationTimestamp="2026-03-20 10:51:18 +0000 UTC" firstStartedPulling="2026-03-20 10:51:19.651861322 +0000 UTC m=+914.793407136" lastFinishedPulling="2026-03-20 10:51:22.065448191 +0000 UTC m=+917.206994005" observedRunningTime="2026-03-20 10:51:22.533585559 +0000 UTC m=+917.675131373" watchObservedRunningTime="2026-03-20 10:51:22.540215125 +0000 UTC m=+917.681760939" Mar 20 10:51:22 crc kubenswrapper[4748]: I0320 10:51:22.561188 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m7p7r" podStartSLOduration=2.00957962 podStartE2EDuration="8.561163758s" podCreationTimestamp="2026-03-20 
10:51:14 +0000 UTC" firstStartedPulling="2026-03-20 10:51:15.471549216 +0000 UTC m=+910.613095050" lastFinishedPulling="2026-03-20 10:51:22.023133384 +0000 UTC m=+917.164679188" observedRunningTime="2026-03-20 10:51:22.553526887 +0000 UTC m=+917.695072721" watchObservedRunningTime="2026-03-20 10:51:22.561163758 +0000 UTC m=+917.702709582" Mar 20 10:51:23 crc kubenswrapper[4748]: I0320 10:51:23.903678 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gwtr7"] Mar 20 10:51:23 crc kubenswrapper[4748]: I0320 10:51:23.905620 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:23 crc kubenswrapper[4748]: I0320 10:51:23.912169 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwtr7"] Mar 20 10:51:23 crc kubenswrapper[4748]: I0320 10:51:23.916073 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-catalog-content\") pod \"redhat-marketplace-gwtr7\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:23 crc kubenswrapper[4748]: I0320 10:51:23.916153 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppzx7\" (UniqueName: \"kubernetes.io/projected/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-kube-api-access-ppzx7\") pod \"redhat-marketplace-gwtr7\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:23 crc kubenswrapper[4748]: I0320 10:51:23.916192 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-utilities\") pod 
\"redhat-marketplace-gwtr7\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.017067 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-catalog-content\") pod \"redhat-marketplace-gwtr7\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.017152 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppzx7\" (UniqueName: \"kubernetes.io/projected/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-kube-api-access-ppzx7\") pod \"redhat-marketplace-gwtr7\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.017188 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-utilities\") pod \"redhat-marketplace-gwtr7\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.017717 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-catalog-content\") pod \"redhat-marketplace-gwtr7\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.017749 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-utilities\") pod \"redhat-marketplace-gwtr7\" (UID: 
\"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.039349 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppzx7\" (UniqueName: \"kubernetes.io/projected/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-kube-api-access-ppzx7\") pod \"redhat-marketplace-gwtr7\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.230311 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.686599 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.686941 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:24 crc kubenswrapper[4748]: W0320 10:51:24.827747 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a8d9c43_e449_4317_9c7d_9dba5e1130c3.slice/crio-26c1520e3c7e771a33e4d5d3757e9b58b903994e32c52812947e1974fc906930 WatchSource:0}: Error finding container 26c1520e3c7e771a33e4d5d3757e9b58b903994e32c52812947e1974fc906930: Status 404 returned error can't find the container with id 26c1520e3c7e771a33e4d5d3757e9b58b903994e32c52812947e1974fc906930 Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.838438 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwtr7"] Mar 20 10:51:24 crc kubenswrapper[4748]: I0320 10:51:24.842515 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:25 crc 
kubenswrapper[4748]: I0320 10:51:25.565446 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwtr7" event={"ID":"7a8d9c43-e449-4317-9c7d-9dba5e1130c3","Type":"ContainerDied","Data":"027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d"} Mar 20 10:51:25 crc kubenswrapper[4748]: I0320 10:51:25.565003 4748 generic.go:334] "Generic (PLEG): container finished" podID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerID="027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d" exitCode=0 Mar 20 10:51:25 crc kubenswrapper[4748]: I0320 10:51:25.566117 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwtr7" event={"ID":"7a8d9c43-e449-4317-9c7d-9dba5e1130c3","Type":"ContainerStarted","Data":"26c1520e3c7e771a33e4d5d3757e9b58b903994e32c52812947e1974fc906930"} Mar 20 10:51:26 crc kubenswrapper[4748]: I0320 10:51:26.576538 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwtr7" event={"ID":"7a8d9c43-e449-4317-9c7d-9dba5e1130c3","Type":"ContainerStarted","Data":"000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1"} Mar 20 10:51:27 crc kubenswrapper[4748]: I0320 10:51:27.587675 4748 generic.go:334] "Generic (PLEG): container finished" podID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerID="000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1" exitCode=0 Mar 20 10:51:27 crc kubenswrapper[4748]: I0320 10:51:27.587749 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwtr7" event={"ID":"7a8d9c43-e449-4317-9c7d-9dba5e1130c3","Type":"ContainerDied","Data":"000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1"} Mar 20 10:51:28 crc kubenswrapper[4748]: I0320 10:51:28.598587 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwtr7" 
event={"ID":"7a8d9c43-e449-4317-9c7d-9dba5e1130c3","Type":"ContainerStarted","Data":"2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72"} Mar 20 10:51:28 crc kubenswrapper[4748]: I0320 10:51:28.624178 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gwtr7" podStartSLOduration=3.218552576 podStartE2EDuration="5.624152657s" podCreationTimestamp="2026-03-20 10:51:23 +0000 UTC" firstStartedPulling="2026-03-20 10:51:25.568998006 +0000 UTC m=+920.710543820" lastFinishedPulling="2026-03-20 10:51:27.974598097 +0000 UTC m=+923.116143901" observedRunningTime="2026-03-20 10:51:28.61706962 +0000 UTC m=+923.758615454" watchObservedRunningTime="2026-03-20 10:51:28.624152657 +0000 UTC m=+923.765698491" Mar 20 10:51:29 crc kubenswrapper[4748]: I0320 10:51:29.224930 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-p9wd8" Mar 20 10:51:29 crc kubenswrapper[4748]: I0320 10:51:29.226161 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-p9wd8" Mar 20 10:51:29 crc kubenswrapper[4748]: I0320 10:51:29.306514 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-p9wd8" Mar 20 10:51:29 crc kubenswrapper[4748]: I0320 10:51:29.633700 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-p9wd8" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.375861 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w"] Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.377823 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.380434 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-242zb" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.398106 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w"] Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.431609 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzjg8\" (UniqueName: \"kubernetes.io/projected/c9a1c042-3dd9-486b-aafd-05767bd2e20b-kube-api-access-tzjg8\") pod \"69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.431897 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-util\") pod \"69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.431995 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-bundle\") pod \"69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 
10:51:31.533334 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzjg8\" (UniqueName: \"kubernetes.io/projected/c9a1c042-3dd9-486b-aafd-05767bd2e20b-kube-api-access-tzjg8\") pod \"69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.533430 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-util\") pod \"69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.533468 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-bundle\") pod \"69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.534177 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-bundle\") pod \"69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.534310 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-util\") pod \"69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.558769 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzjg8\" (UniqueName: \"kubernetes.io/projected/c9a1c042-3dd9-486b-aafd-05767bd2e20b-kube-api-access-tzjg8\") pod \"69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:31 crc kubenswrapper[4748]: I0320 10:51:31.698184 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:32 crc kubenswrapper[4748]: I0320 10:51:32.160123 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w"] Mar 20 10:51:32 crc kubenswrapper[4748]: I0320 10:51:32.628510 4748 generic.go:334] "Generic (PLEG): container finished" podID="c9a1c042-3dd9-486b-aafd-05767bd2e20b" containerID="c5c6c8a053b57d8ae70691898c1b629163bd875df5e5432280b90ce09d9bbd52" exitCode=0 Mar 20 10:51:32 crc kubenswrapper[4748]: I0320 10:51:32.628570 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" event={"ID":"c9a1c042-3dd9-486b-aafd-05767bd2e20b","Type":"ContainerDied","Data":"c5c6c8a053b57d8ae70691898c1b629163bd875df5e5432280b90ce09d9bbd52"} Mar 20 10:51:32 crc kubenswrapper[4748]: I0320 10:51:32.628826 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" event={"ID":"c9a1c042-3dd9-486b-aafd-05767bd2e20b","Type":"ContainerStarted","Data":"ea43b7d4183a3423fa6a71bb019a792cbdfa8607896d3a2274c3aa3dac93251f"} Mar 20 10:51:33 crc kubenswrapper[4748]: I0320 10:51:33.638057 4748 generic.go:334] "Generic (PLEG): container finished" podID="c9a1c042-3dd9-486b-aafd-05767bd2e20b" containerID="f7d9aa6ee67b2e16358c81127cc00d7796e90dcd815d85a8bc18fca8bfb7177a" exitCode=0 Mar 20 10:51:33 crc kubenswrapper[4748]: I0320 10:51:33.638131 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" event={"ID":"c9a1c042-3dd9-486b-aafd-05767bd2e20b","Type":"ContainerDied","Data":"f7d9aa6ee67b2e16358c81127cc00d7796e90dcd815d85a8bc18fca8bfb7177a"} Mar 20 10:51:34 crc kubenswrapper[4748]: I0320 10:51:34.230744 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:34 crc kubenswrapper[4748]: I0320 10:51:34.230861 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:34 crc kubenswrapper[4748]: I0320 10:51:34.293489 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:34 crc kubenswrapper[4748]: I0320 10:51:34.648046 4748 generic.go:334] "Generic (PLEG): container finished" podID="c9a1c042-3dd9-486b-aafd-05767bd2e20b" containerID="0ab0549867eb23395252c60855c28d7f221df486e1fcdf364aacd02391473683" exitCode=0 Mar 20 10:51:34 crc kubenswrapper[4748]: I0320 10:51:34.648317 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" 
event={"ID":"c9a1c042-3dd9-486b-aafd-05767bd2e20b","Type":"ContainerDied","Data":"0ab0549867eb23395252c60855c28d7f221df486e1fcdf364aacd02391473683"} Mar 20 10:51:34 crc kubenswrapper[4748]: I0320 10:51:34.710189 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:34 crc kubenswrapper[4748]: I0320 10:51:34.754205 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m7p7r" Mar 20 10:51:35 crc kubenswrapper[4748]: I0320 10:51:35.698494 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwtr7"] Mar 20 10:51:35 crc kubenswrapper[4748]: I0320 10:51:35.983525 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.012157 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-util\") pod \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.012339 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-bundle\") pod \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.012372 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzjg8\" (UniqueName: \"kubernetes.io/projected/c9a1c042-3dd9-486b-aafd-05767bd2e20b-kube-api-access-tzjg8\") pod \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\" (UID: \"c9a1c042-3dd9-486b-aafd-05767bd2e20b\") " Mar 20 10:51:36 crc 
kubenswrapper[4748]: I0320 10:51:36.014714 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-bundle" (OuterVolumeSpecName: "bundle") pod "c9a1c042-3dd9-486b-aafd-05767bd2e20b" (UID: "c9a1c042-3dd9-486b-aafd-05767bd2e20b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.020657 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a1c042-3dd9-486b-aafd-05767bd2e20b-kube-api-access-tzjg8" (OuterVolumeSpecName: "kube-api-access-tzjg8") pod "c9a1c042-3dd9-486b-aafd-05767bd2e20b" (UID: "c9a1c042-3dd9-486b-aafd-05767bd2e20b"). InnerVolumeSpecName "kube-api-access-tzjg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.036973 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-util" (OuterVolumeSpecName: "util") pod "c9a1c042-3dd9-486b-aafd-05767bd2e20b" (UID: "c9a1c042-3dd9-486b-aafd-05767bd2e20b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.115201 4748 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.115246 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzjg8\" (UniqueName: \"kubernetes.io/projected/c9a1c042-3dd9-486b-aafd-05767bd2e20b-kube-api-access-tzjg8\") on node \"crc\" DevicePath \"\"" Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.115333 4748 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9a1c042-3dd9-486b-aafd-05767bd2e20b-util\") on node \"crc\" DevicePath \"\"" Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.668123 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" event={"ID":"c9a1c042-3dd9-486b-aafd-05767bd2e20b","Type":"ContainerDied","Data":"ea43b7d4183a3423fa6a71bb019a792cbdfa8607896d3a2274c3aa3dac93251f"} Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.668561 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea43b7d4183a3423fa6a71bb019a792cbdfa8607896d3a2274c3aa3dac93251f" Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.668163 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w" Mar 20 10:51:36 crc kubenswrapper[4748]: I0320 10:51:36.668319 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gwtr7" podUID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerName="registry-server" containerID="cri-o://2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72" gracePeriod=2 Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.078720 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.131212 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-catalog-content\") pod \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.131350 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppzx7\" (UniqueName: \"kubernetes.io/projected/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-kube-api-access-ppzx7\") pod \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.131406 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-utilities\") pod \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\" (UID: \"7a8d9c43-e449-4317-9c7d-9dba5e1130c3\") " Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.132795 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-utilities" (OuterVolumeSpecName: "utilities") 
pod "7a8d9c43-e449-4317-9c7d-9dba5e1130c3" (UID: "7a8d9c43-e449-4317-9c7d-9dba5e1130c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.141032 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-kube-api-access-ppzx7" (OuterVolumeSpecName: "kube-api-access-ppzx7") pod "7a8d9c43-e449-4317-9c7d-9dba5e1130c3" (UID: "7a8d9c43-e449-4317-9c7d-9dba5e1130c3"). InnerVolumeSpecName "kube-api-access-ppzx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.165751 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a8d9c43-e449-4317-9c7d-9dba5e1130c3" (UID: "7a8d9c43-e449-4317-9c7d-9dba5e1130c3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.232861 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.232901 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppzx7\" (UniqueName: \"kubernetes.io/projected/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-kube-api-access-ppzx7\") on node \"crc\" DevicePath \"\"" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.232937 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a8d9c43-e449-4317-9c7d-9dba5e1130c3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.679309 4748 generic.go:334] "Generic (PLEG): container finished" podID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerID="2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72" exitCode=0 Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.679506 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwtr7" event={"ID":"7a8d9c43-e449-4317-9c7d-9dba5e1130c3","Type":"ContainerDied","Data":"2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72"} Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.685929 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwtr7" event={"ID":"7a8d9c43-e449-4317-9c7d-9dba5e1130c3","Type":"ContainerDied","Data":"26c1520e3c7e771a33e4d5d3757e9b58b903994e32c52812947e1974fc906930"} Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.685967 4748 scope.go:117] "RemoveContainer" containerID="2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 
10:51:37.679661 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwtr7" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.710539 4748 scope.go:117] "RemoveContainer" containerID="000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.715464 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwtr7"] Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.722136 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwtr7"] Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.732614 4748 scope.go:117] "RemoveContainer" containerID="027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.754054 4748 scope.go:117] "RemoveContainer" containerID="2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72" Mar 20 10:51:37 crc kubenswrapper[4748]: E0320 10:51:37.754514 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72\": container with ID starting with 2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72 not found: ID does not exist" containerID="2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.754558 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72"} err="failed to get container status \"2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72\": rpc error: code = NotFound desc = could not find container \"2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72\": container with ID starting with 
2a067076c7440c0bccb3a73e82a8e1ea2928c3b36d1032879d9161078c979f72 not found: ID does not exist" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.754591 4748 scope.go:117] "RemoveContainer" containerID="000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1" Mar 20 10:51:37 crc kubenswrapper[4748]: E0320 10:51:37.754962 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1\": container with ID starting with 000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1 not found: ID does not exist" containerID="000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.754983 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1"} err="failed to get container status \"000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1\": rpc error: code = NotFound desc = could not find container \"000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1\": container with ID starting with 000b0928ae8384e259ba2497c27a2b601e1f93ef550c10ef2c650fc79963e4b1 not found: ID does not exist" Mar 20 10:51:37 crc kubenswrapper[4748]: I0320 10:51:37.754998 4748 scope.go:117] "RemoveContainer" containerID="027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d" Mar 20 10:51:37 crc kubenswrapper[4748]: E0320 10:51:37.755268 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d\": container with ID starting with 027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d not found: ID does not exist" containerID="027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d" Mar 20 10:51:37 crc 
kubenswrapper[4748]: I0320 10:51:37.755286 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d"} err="failed to get container status \"027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d\": rpc error: code = NotFound desc = could not find container \"027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d\": container with ID starting with 027d5789e4d26493abf5f4a142f7d232e8062c7e1e469cbd0bd7d2906896441d not found: ID does not exist" Mar 20 10:51:38 crc kubenswrapper[4748]: I0320 10:51:38.734345 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7p7r"] Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.292944 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6z29z"] Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.293699 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6z29z" podUID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerName="registry-server" containerID="cri-o://750d87d8ef0f290024f77c3d1b17a9fa28a645c91eff50a38838d86cc4b74c8e" gracePeriod=2 Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.527216 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" path="/var/lib/kubelet/pods/7a8d9c43-e449-4317-9c7d-9dba5e1130c3/volumes" Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.704540 4748 generic.go:334] "Generic (PLEG): container finished" podID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerID="750d87d8ef0f290024f77c3d1b17a9fa28a645c91eff50a38838d86cc4b74c8e" exitCode=0 Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.704604 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z29z" 
event={"ID":"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d","Type":"ContainerDied","Data":"750d87d8ef0f290024f77c3d1b17a9fa28a645c91eff50a38838d86cc4b74c8e"} Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.786776 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.972443 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt9qj\" (UniqueName: \"kubernetes.io/projected/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-kube-api-access-gt9qj\") pod \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.972715 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-catalog-content\") pod \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.972751 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-utilities\") pod \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\" (UID: \"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d\") " Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.974116 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-utilities" (OuterVolumeSpecName: "utilities") pod "88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" (UID: "88e0b46d-a2cb-4927-9b1b-120bd2e07f5d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:51:39 crc kubenswrapper[4748]: I0320 10:51:39.980102 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-kube-api-access-gt9qj" (OuterVolumeSpecName: "kube-api-access-gt9qj") pod "88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" (UID: "88e0b46d-a2cb-4927-9b1b-120bd2e07f5d"). InnerVolumeSpecName "kube-api-access-gt9qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.024345 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" (UID: "88e0b46d-a2cb-4927-9b1b-120bd2e07f5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.075324 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.075373 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.075384 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt9qj\" (UniqueName: \"kubernetes.io/projected/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d-kube-api-access-gt9qj\") on node \"crc\" DevicePath \"\"" Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.713989 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6z29z" 
event={"ID":"88e0b46d-a2cb-4927-9b1b-120bd2e07f5d","Type":"ContainerDied","Data":"cbb259e56d8151d610054f57499bc743d81cbf714108b8c61a6fd35bbec9f8a3"} Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.714555 4748 scope.go:117] "RemoveContainer" containerID="750d87d8ef0f290024f77c3d1b17a9fa28a645c91eff50a38838d86cc4b74c8e" Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.714049 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6z29z" Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.742513 4748 scope.go:117] "RemoveContainer" containerID="0252a36180031a6eed46041ba045fb4b4447018f24b3826a05a8d9513c52d198" Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.755510 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6z29z"] Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.760535 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6z29z"] Mar 20 10:51:40 crc kubenswrapper[4748]: I0320 10:51:40.769044 4748 scope.go:117] "RemoveContainer" containerID="9177ba6cab2a6a6d71fde462aa7290c0b1d2b61323e4547f6a238c1997276536" Mar 20 10:51:41 crc kubenswrapper[4748]: I0320 10:51:41.524951 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" path="/var/lib/kubelet/pods/88e0b46d-a2cb-4927-9b1b-120bd2e07f5d/volumes" Mar 20 10:51:42 crc kubenswrapper[4748]: I0320 10:51:42.928997 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:51:42 crc kubenswrapper[4748]: I0320 10:51:42.929586 4748 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:51:42 crc kubenswrapper[4748]: I0320 10:51:42.929656 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:51:42 crc kubenswrapper[4748]: I0320 10:51:42.930570 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9de74f70ec1f6bdcd394aa24f2c91f712fff4a83d715b6c60cbd64842427bc57"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:51:42 crc kubenswrapper[4748]: I0320 10:51:42.930627 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://9de74f70ec1f6bdcd394aa24f2c91f712fff4a83d715b6c60cbd64842427bc57" gracePeriod=600 Mar 20 10:51:43 crc kubenswrapper[4748]: I0320 10:51:43.739955 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="9de74f70ec1f6bdcd394aa24f2c91f712fff4a83d715b6c60cbd64842427bc57" exitCode=0 Mar 20 10:51:43 crc kubenswrapper[4748]: I0320 10:51:43.740013 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"9de74f70ec1f6bdcd394aa24f2c91f712fff4a83d715b6c60cbd64842427bc57"} Mar 20 10:51:43 crc kubenswrapper[4748]: I0320 10:51:43.740480 4748 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"c1253b5c73f9d1fbdb9567d5ed33468b56df43e62dd82201c361a89f824870ce"} Mar 20 10:51:43 crc kubenswrapper[4748]: I0320 10:51:43.740515 4748 scope.go:117] "RemoveContainer" containerID="261e51a50d63c2c75ffa3ccd3542b6a7465ac5ce52a44fe320cc511b3ec023ef" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.160179 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz"] Mar 20 10:51:44 crc kubenswrapper[4748]: E0320 10:51:44.160887 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerName="extract-utilities" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.160904 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerName="extract-utilities" Mar 20 10:51:44 crc kubenswrapper[4748]: E0320 10:51:44.160920 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a1c042-3dd9-486b-aafd-05767bd2e20b" containerName="pull" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.160927 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a1c042-3dd9-486b-aafd-05767bd2e20b" containerName="pull" Mar 20 10:51:44 crc kubenswrapper[4748]: E0320 10:51:44.160938 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerName="extract-content" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.160948 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerName="extract-content" Mar 20 10:51:44 crc kubenswrapper[4748]: E0320 10:51:44.160957 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a1c042-3dd9-486b-aafd-05767bd2e20b" containerName="extract" Mar 20 
10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.160963 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a1c042-3dd9-486b-aafd-05767bd2e20b" containerName="extract" Mar 20 10:51:44 crc kubenswrapper[4748]: E0320 10:51:44.160974 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerName="extract-utilities" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.160981 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerName="extract-utilities" Mar 20 10:51:44 crc kubenswrapper[4748]: E0320 10:51:44.160994 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerName="registry-server" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.161001 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerName="registry-server" Mar 20 10:51:44 crc kubenswrapper[4748]: E0320 10:51:44.161011 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a1c042-3dd9-486b-aafd-05767bd2e20b" containerName="util" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.161018 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a1c042-3dd9-486b-aafd-05767bd2e20b" containerName="util" Mar 20 10:51:44 crc kubenswrapper[4748]: E0320 10:51:44.161029 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerName="registry-server" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.161035 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerName="registry-server" Mar 20 10:51:44 crc kubenswrapper[4748]: E0320 10:51:44.161042 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerName="extract-content" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 
10:51:44.161048 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerName="extract-content" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.161187 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8d9c43-e449-4317-9c7d-9dba5e1130c3" containerName="registry-server" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.161206 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a1c042-3dd9-486b-aafd-05767bd2e20b" containerName="extract" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.161217 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e0b46d-a2cb-4927-9b1b-120bd2e07f5d" containerName="registry-server" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.161728 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.164619 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-59b98" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.193073 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz"] Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.280763 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzq2q\" (UniqueName: \"kubernetes.io/projected/35554cb6-28ee-4104-8591-ee987f93805b-kube-api-access-vzq2q\") pod \"openstack-operator-controller-init-d8d579484-vx6pz\" (UID: \"35554cb6-28ee-4104-8591-ee987f93805b\") " pod="openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.382878 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vzq2q\" (UniqueName: \"kubernetes.io/projected/35554cb6-28ee-4104-8591-ee987f93805b-kube-api-access-vzq2q\") pod \"openstack-operator-controller-init-d8d579484-vx6pz\" (UID: \"35554cb6-28ee-4104-8591-ee987f93805b\") " pod="openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.428886 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzq2q\" (UniqueName: \"kubernetes.io/projected/35554cb6-28ee-4104-8591-ee987f93805b-kube-api-access-vzq2q\") pod \"openstack-operator-controller-init-d8d579484-vx6pz\" (UID: \"35554cb6-28ee-4104-8591-ee987f93805b\") " pod="openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz" Mar 20 10:51:44 crc kubenswrapper[4748]: I0320 10:51:44.484926 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.015543 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz"] Mar 20 10:51:45 crc kubenswrapper[4748]: W0320 10:51:45.020107 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35554cb6_28ee_4104_8591_ee987f93805b.slice/crio-3466c445b4cd3ec213e2d21d7268a9f4521fc0e499d7c54a3fa011f41b22e5ae WatchSource:0}: Error finding container 3466c445b4cd3ec213e2d21d7268a9f4521fc0e499d7c54a3fa011f41b22e5ae: Status 404 returned error can't find the container with id 3466c445b4cd3ec213e2d21d7268a9f4521fc0e499d7c54a3fa011f41b22e5ae Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.334892 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c6qwm"] Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.337543 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.350140 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6qwm"] Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.419152 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxmll\" (UniqueName: \"kubernetes.io/projected/a71adfec-5917-40c6-bbb9-4b027fca626b-kube-api-access-gxmll\") pod \"certified-operators-c6qwm\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.419259 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-utilities\") pod \"certified-operators-c6qwm\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.419288 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-catalog-content\") pod \"certified-operators-c6qwm\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.521038 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-utilities\") pod \"certified-operators-c6qwm\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.521093 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-catalog-content\") pod \"certified-operators-c6qwm\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.521165 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmll\" (UniqueName: \"kubernetes.io/projected/a71adfec-5917-40c6-bbb9-4b027fca626b-kube-api-access-gxmll\") pod \"certified-operators-c6qwm\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.522174 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-utilities\") pod \"certified-operators-c6qwm\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.522181 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-catalog-content\") pod \"certified-operators-c6qwm\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.546003 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmll\" (UniqueName: \"kubernetes.io/projected/a71adfec-5917-40c6-bbb9-4b027fca626b-kube-api-access-gxmll\") pod \"certified-operators-c6qwm\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.670814 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:45 crc kubenswrapper[4748]: I0320 10:51:45.806179 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz" event={"ID":"35554cb6-28ee-4104-8591-ee987f93805b","Type":"ContainerStarted","Data":"3466c445b4cd3ec213e2d21d7268a9f4521fc0e499d7c54a3fa011f41b22e5ae"} Mar 20 10:51:46 crc kubenswrapper[4748]: I0320 10:51:46.345397 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6qwm"] Mar 20 10:51:46 crc kubenswrapper[4748]: W0320 10:51:46.376090 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda71adfec_5917_40c6_bbb9_4b027fca626b.slice/crio-3024bd5d7afafaac30eed3c24ac30724194543446252f8abcfc3d76ccc26e6d3 WatchSource:0}: Error finding container 3024bd5d7afafaac30eed3c24ac30724194543446252f8abcfc3d76ccc26e6d3: Status 404 returned error can't find the container with id 3024bd5d7afafaac30eed3c24ac30724194543446252f8abcfc3d76ccc26e6d3 Mar 20 10:51:46 crc kubenswrapper[4748]: I0320 10:51:46.815813 4748 generic.go:334] "Generic (PLEG): container finished" podID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerID="57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa" exitCode=0 Mar 20 10:51:46 crc kubenswrapper[4748]: I0320 10:51:46.815893 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qwm" event={"ID":"a71adfec-5917-40c6-bbb9-4b027fca626b","Type":"ContainerDied","Data":"57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa"} Mar 20 10:51:46 crc kubenswrapper[4748]: I0320 10:51:46.815927 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qwm" 
event={"ID":"a71adfec-5917-40c6-bbb9-4b027fca626b","Type":"ContainerStarted","Data":"3024bd5d7afafaac30eed3c24ac30724194543446252f8abcfc3d76ccc26e6d3"} Mar 20 10:51:50 crc kubenswrapper[4748]: I0320 10:51:50.850115 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz" event={"ID":"35554cb6-28ee-4104-8591-ee987f93805b","Type":"ContainerStarted","Data":"d38dc6cf9056752cb509e3305951248279c8eb350916d10dd72afc6ccf5bc50c"} Mar 20 10:51:50 crc kubenswrapper[4748]: I0320 10:51:50.851124 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz" Mar 20 10:51:50 crc kubenswrapper[4748]: I0320 10:51:50.852940 4748 generic.go:334] "Generic (PLEG): container finished" podID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerID="76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88" exitCode=0 Mar 20 10:51:50 crc kubenswrapper[4748]: I0320 10:51:50.852988 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qwm" event={"ID":"a71adfec-5917-40c6-bbb9-4b027fca626b","Type":"ContainerDied","Data":"76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88"} Mar 20 10:51:50 crc kubenswrapper[4748]: I0320 10:51:50.888785 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz" podStartSLOduration=2.026877309 podStartE2EDuration="6.888755045s" podCreationTimestamp="2026-03-20 10:51:44 +0000 UTC" firstStartedPulling="2026-03-20 10:51:45.022293727 +0000 UTC m=+940.163839541" lastFinishedPulling="2026-03-20 10:51:49.884171463 +0000 UTC m=+945.025717277" observedRunningTime="2026-03-20 10:51:50.88214959 +0000 UTC m=+946.023695404" watchObservedRunningTime="2026-03-20 10:51:50.888755045 +0000 UTC m=+946.030300869" Mar 20 10:51:51 crc kubenswrapper[4748]: I0320 
10:51:51.862561 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qwm" event={"ID":"a71adfec-5917-40c6-bbb9-4b027fca626b","Type":"ContainerStarted","Data":"6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164"} Mar 20 10:51:51 crc kubenswrapper[4748]: I0320 10:51:51.887734 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c6qwm" podStartSLOduration=2.113643244 podStartE2EDuration="6.887701636s" podCreationTimestamp="2026-03-20 10:51:45 +0000 UTC" firstStartedPulling="2026-03-20 10:51:46.817488024 +0000 UTC m=+941.959033828" lastFinishedPulling="2026-03-20 10:51:51.591546406 +0000 UTC m=+946.733092220" observedRunningTime="2026-03-20 10:51:51.883756738 +0000 UTC m=+947.025302552" watchObservedRunningTime="2026-03-20 10:51:51.887701636 +0000 UTC m=+947.029247450" Mar 20 10:51:55 crc kubenswrapper[4748]: I0320 10:51:55.671615 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:55 crc kubenswrapper[4748]: I0320 10:51:55.672480 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:51:55 crc kubenswrapper[4748]: I0320 10:51:55.712365 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.139536 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566732-vvq4n"] Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.141365 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566732-vvq4n" Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.158380 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.158917 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.158536 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566732-vvq4n"] Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.161289 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.206740 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvb9v\" (UniqueName: \"kubernetes.io/projected/c2b93435-c7d8-4adc-856f-5b318567767e-kube-api-access-cvb9v\") pod \"auto-csr-approver-29566732-vvq4n\" (UID: \"c2b93435-c7d8-4adc-856f-5b318567767e\") " pod="openshift-infra/auto-csr-approver-29566732-vvq4n" Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.308427 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvb9v\" (UniqueName: \"kubernetes.io/projected/c2b93435-c7d8-4adc-856f-5b318567767e-kube-api-access-cvb9v\") pod \"auto-csr-approver-29566732-vvq4n\" (UID: \"c2b93435-c7d8-4adc-856f-5b318567767e\") " pod="openshift-infra/auto-csr-approver-29566732-vvq4n" Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.329825 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvb9v\" (UniqueName: \"kubernetes.io/projected/c2b93435-c7d8-4adc-856f-5b318567767e-kube-api-access-cvb9v\") pod \"auto-csr-approver-29566732-vvq4n\" (UID: \"c2b93435-c7d8-4adc-856f-5b318567767e\") " 
pod="openshift-infra/auto-csr-approver-29566732-vvq4n" Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.465759 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566732-vvq4n" Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.728273 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566732-vvq4n"] Mar 20 10:52:00 crc kubenswrapper[4748]: I0320 10:52:00.947247 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566732-vvq4n" event={"ID":"c2b93435-c7d8-4adc-856f-5b318567767e","Type":"ContainerStarted","Data":"f487861eec98cc49c666427ffa92e96ea40d8fe7b937628184c2aced52366106"} Mar 20 10:52:02 crc kubenswrapper[4748]: I0320 10:52:02.965616 4748 generic.go:334] "Generic (PLEG): container finished" podID="c2b93435-c7d8-4adc-856f-5b318567767e" containerID="e8c94ff1d5e9a53a0a5d8c040dd94d943d991dc9ef7fe18e2ca0b299ba2c0393" exitCode=0 Mar 20 10:52:02 crc kubenswrapper[4748]: I0320 10:52:02.965730 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566732-vvq4n" event={"ID":"c2b93435-c7d8-4adc-856f-5b318567767e","Type":"ContainerDied","Data":"e8c94ff1d5e9a53a0a5d8c040dd94d943d991dc9ef7fe18e2ca0b299ba2c0393"} Mar 20 10:52:04 crc kubenswrapper[4748]: I0320 10:52:04.246263 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566732-vvq4n" Mar 20 10:52:04 crc kubenswrapper[4748]: I0320 10:52:04.376514 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvb9v\" (UniqueName: \"kubernetes.io/projected/c2b93435-c7d8-4adc-856f-5b318567767e-kube-api-access-cvb9v\") pod \"c2b93435-c7d8-4adc-856f-5b318567767e\" (UID: \"c2b93435-c7d8-4adc-856f-5b318567767e\") " Mar 20 10:52:04 crc kubenswrapper[4748]: I0320 10:52:04.384209 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b93435-c7d8-4adc-856f-5b318567767e-kube-api-access-cvb9v" (OuterVolumeSpecName: "kube-api-access-cvb9v") pod "c2b93435-c7d8-4adc-856f-5b318567767e" (UID: "c2b93435-c7d8-4adc-856f-5b318567767e"). InnerVolumeSpecName "kube-api-access-cvb9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:52:04 crc kubenswrapper[4748]: I0320 10:52:04.479160 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvb9v\" (UniqueName: \"kubernetes.io/projected/c2b93435-c7d8-4adc-856f-5b318567767e-kube-api-access-cvb9v\") on node \"crc\" DevicePath \"\"" Mar 20 10:52:04 crc kubenswrapper[4748]: I0320 10:52:04.488787 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-d8d579484-vx6pz" Mar 20 10:52:04 crc kubenswrapper[4748]: I0320 10:52:04.981997 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566732-vvq4n" event={"ID":"c2b93435-c7d8-4adc-856f-5b318567767e","Type":"ContainerDied","Data":"f487861eec98cc49c666427ffa92e96ea40d8fe7b937628184c2aced52366106"} Mar 20 10:52:04 crc kubenswrapper[4748]: I0320 10:52:04.982049 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566732-vvq4n" Mar 20 10:52:04 crc kubenswrapper[4748]: I0320 10:52:04.982064 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f487861eec98cc49c666427ffa92e96ea40d8fe7b937628184c2aced52366106" Mar 20 10:52:05 crc kubenswrapper[4748]: I0320 10:52:05.384631 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566726-ts528"] Mar 20 10:52:05 crc kubenswrapper[4748]: I0320 10:52:05.390155 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566726-ts528"] Mar 20 10:52:05 crc kubenswrapper[4748]: I0320 10:52:05.557154 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375ada94-2843-44ed-8f85-0be2255a00de" path="/var/lib/kubelet/pods/375ada94-2843-44ed-8f85-0be2255a00de/volumes" Mar 20 10:52:05 crc kubenswrapper[4748]: I0320 10:52:05.720805 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:52:05 crc kubenswrapper[4748]: I0320 10:52:05.775560 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c6qwm"] Mar 20 10:52:05 crc kubenswrapper[4748]: I0320 10:52:05.989399 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c6qwm" podUID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerName="registry-server" containerID="cri-o://6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164" gracePeriod=2 Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.380737 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.515141 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-utilities\") pod \"a71adfec-5917-40c6-bbb9-4b027fca626b\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.515269 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxmll\" (UniqueName: \"kubernetes.io/projected/a71adfec-5917-40c6-bbb9-4b027fca626b-kube-api-access-gxmll\") pod \"a71adfec-5917-40c6-bbb9-4b027fca626b\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.515462 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-catalog-content\") pod \"a71adfec-5917-40c6-bbb9-4b027fca626b\" (UID: \"a71adfec-5917-40c6-bbb9-4b027fca626b\") " Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.516233 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-utilities" (OuterVolumeSpecName: "utilities") pod "a71adfec-5917-40c6-bbb9-4b027fca626b" (UID: "a71adfec-5917-40c6-bbb9-4b027fca626b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.522076 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71adfec-5917-40c6-bbb9-4b027fca626b-kube-api-access-gxmll" (OuterVolumeSpecName: "kube-api-access-gxmll") pod "a71adfec-5917-40c6-bbb9-4b027fca626b" (UID: "a71adfec-5917-40c6-bbb9-4b027fca626b"). InnerVolumeSpecName "kube-api-access-gxmll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.578181 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a71adfec-5917-40c6-bbb9-4b027fca626b" (UID: "a71adfec-5917-40c6-bbb9-4b027fca626b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.617209 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.617246 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxmll\" (UniqueName: \"kubernetes.io/projected/a71adfec-5917-40c6-bbb9-4b027fca626b-kube-api-access-gxmll\") on node \"crc\" DevicePath \"\"" Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.617260 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71adfec-5917-40c6-bbb9-4b027fca626b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.999143 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6qwm" Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.999660 4748 generic.go:334] "Generic (PLEG): container finished" podID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerID="6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164" exitCode=0 Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.999796 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qwm" event={"ID":"a71adfec-5917-40c6-bbb9-4b027fca626b","Type":"ContainerDied","Data":"6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164"} Mar 20 10:52:06 crc kubenswrapper[4748]: I0320 10:52:06.999914 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6qwm" event={"ID":"a71adfec-5917-40c6-bbb9-4b027fca626b","Type":"ContainerDied","Data":"3024bd5d7afafaac30eed3c24ac30724194543446252f8abcfc3d76ccc26e6d3"} Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:06.999953 4748 scope.go:117] "RemoveContainer" containerID="6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164" Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.041266 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c6qwm"] Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.047264 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c6qwm"] Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.269110 4748 scope.go:117] "RemoveContainer" containerID="76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88" Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.292138 4748 scope.go:117] "RemoveContainer" containerID="57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa" Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.308742 4748 scope.go:117] "RemoveContainer" 
containerID="6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164" Mar 20 10:52:07 crc kubenswrapper[4748]: E0320 10:52:07.309389 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164\": container with ID starting with 6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164 not found: ID does not exist" containerID="6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164" Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.309461 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164"} err="failed to get container status \"6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164\": rpc error: code = NotFound desc = could not find container \"6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164\": container with ID starting with 6d0b954c4cedccd196ebd1a157adb79c9012198c56376b126052da4da3fde164 not found: ID does not exist" Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.309497 4748 scope.go:117] "RemoveContainer" containerID="76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88" Mar 20 10:52:07 crc kubenswrapper[4748]: E0320 10:52:07.309932 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88\": container with ID starting with 76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88 not found: ID does not exist" containerID="76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88" Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.309993 4748 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88"} err="failed to get container status \"76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88\": rpc error: code = NotFound desc = could not find container \"76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88\": container with ID starting with 76690d67fa3dde1f526e83c4fe00f6615c197f119da9651290be79e1200e7b88 not found: ID does not exist" Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.310039 4748 scope.go:117] "RemoveContainer" containerID="57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa" Mar 20 10:52:07 crc kubenswrapper[4748]: E0320 10:52:07.310553 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa\": container with ID starting with 57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa not found: ID does not exist" containerID="57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa" Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.310580 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa"} err="failed to get container status \"57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa\": rpc error: code = NotFound desc = could not find container \"57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa\": container with ID starting with 57007597c710d186b660ea76f0333160fc921b046cc9ae317676700bb13d45aa not found: ID does not exist" Mar 20 10:52:07 crc kubenswrapper[4748]: I0320 10:52:07.522740 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a71adfec-5917-40c6-bbb9-4b027fca626b" path="/var/lib/kubelet/pods/a71adfec-5917-40c6-bbb9-4b027fca626b/volumes" Mar 20 10:52:14 crc kubenswrapper[4748]: I0320 
10:52:14.626871 4748 scope.go:117] "RemoveContainer" containerID="034dd2badb36035539c1599476c14607f6a703ff3b80fce629e2ab422f884e3a" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.279017 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2"] Mar 20 10:52:24 crc kubenswrapper[4748]: E0320 10:52:24.279899 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b93435-c7d8-4adc-856f-5b318567767e" containerName="oc" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.279917 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b93435-c7d8-4adc-856f-5b318567767e" containerName="oc" Mar 20 10:52:24 crc kubenswrapper[4748]: E0320 10:52:24.279929 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerName="extract-utilities" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.279937 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerName="extract-utilities" Mar 20 10:52:24 crc kubenswrapper[4748]: E0320 10:52:24.279946 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerName="extract-content" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.279954 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerName="extract-content" Mar 20 10:52:24 crc kubenswrapper[4748]: E0320 10:52:24.279972 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerName="registry-server" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.279979 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerName="registry-server" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.280103 4748 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c2b93435-c7d8-4adc-856f-5b318567767e" containerName="oc" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.280119 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a71adfec-5917-40c6-bbb9-4b027fca626b" containerName="registry-server" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.280625 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.282952 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-2bqxr" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.292267 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.293443 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.297806 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-pgknk" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.305445 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.314066 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.341456 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.342346 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.350036 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.360115 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gc9j5" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.373628 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.374782 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.377894 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-ghzn9" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.379331 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kj9l\" (UniqueName: \"kubernetes.io/projected/20023868-c089-41ec-ac26-9b4882fbab50-kube-api-access-2kj9l\") pod \"barbican-operator-controller-manager-59bc569d95-4gsq2\" (UID: \"20023868-c089-41ec-ac26-9b4882fbab50\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.379402 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz4qn\" (UniqueName: \"kubernetes.io/projected/1ec4d02c-2709-4102-8a27-c4e7c71ed61f-kube-api-access-rz4qn\") pod \"cinder-operator-controller-manager-8d58dc466-stsk5\" (UID: \"1ec4d02c-2709-4102-8a27-c4e7c71ed61f\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.396663 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.420433 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.421289 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.427069 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rc54h" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.458022 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.458851 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.462780 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-pb9sq" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.477909 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.481903 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kj9l\" (UniqueName: \"kubernetes.io/projected/20023868-c089-41ec-ac26-9b4882fbab50-kube-api-access-2kj9l\") pod \"barbican-operator-controller-manager-59bc569d95-4gsq2\" (UID: \"20023868-c089-41ec-ac26-9b4882fbab50\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.481969 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc8ss\" (UniqueName: \"kubernetes.io/projected/d855d6bf-853d-454b-b0b7-feb11f23cc17-kube-api-access-sc8ss\") pod \"designate-operator-controller-manager-588d4d986b-9gzr8\" (UID: \"d855d6bf-853d-454b-b0b7-feb11f23cc17\") " 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.482001 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz4qn\" (UniqueName: \"kubernetes.io/projected/1ec4d02c-2709-4102-8a27-c4e7c71ed61f-kube-api-access-rz4qn\") pod \"cinder-operator-controller-manager-8d58dc466-stsk5\" (UID: \"1ec4d02c-2709-4102-8a27-c4e7c71ed61f\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.482042 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8qhl\" (UniqueName: \"kubernetes.io/projected/8582a4fb-51b2-411c-a67f-31a023f40493-kube-api-access-b8qhl\") pod \"glance-operator-controller-manager-79df6bcc97-dn24h\" (UID: \"8582a4fb-51b2-411c-a67f-31a023f40493\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.482068 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhb76\" (UniqueName: \"kubernetes.io/projected/3fb5bb3a-ab86-4c3f-9d2d-9ef6d7f7ca1f-kube-api-access-rhb76\") pod \"heat-operator-controller-manager-67dd5f86f5-r4sct\" (UID: \"3fb5bb3a-ab86-4c3f-9d2d-9ef6d7f7ca1f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.486473 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.494764 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.496420 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.503029 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bpx2n" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.503142 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.510480 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.511541 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.526803 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-h99vz" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.538159 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kj9l\" (UniqueName: \"kubernetes.io/projected/20023868-c089-41ec-ac26-9b4882fbab50-kube-api-access-2kj9l\") pod \"barbican-operator-controller-manager-59bc569d95-4gsq2\" (UID: \"20023868-c089-41ec-ac26-9b4882fbab50\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.544930 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.545938 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2"] Mar 20 10:52:24 crc kubenswrapper[4748]: 
I0320 10:52:24.546752 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.555857 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-cql2j" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.557928 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.568603 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.585534 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.585609 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bvhk\" (UniqueName: \"kubernetes.io/projected/5d9f2386-33fc-43e9-9a61-e0d57fd94fbe-kube-api-access-9bvhk\") pod \"horizon-operator-controller-manager-8464cc45fb-9gkdc\" (UID: \"5d9f2386-33fc-43e9-9a61-e0d57fd94fbe\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.585656 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l77cr\" (UniqueName: \"kubernetes.io/projected/640d4c26-acbd-4cb4-8b59-fde206294a91-kube-api-access-l77cr\") pod 
\"ironic-operator-controller-manager-6f787dddc9-9zhhz\" (UID: \"640d4c26-acbd-4cb4-8b59-fde206294a91\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.585687 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc8ss\" (UniqueName: \"kubernetes.io/projected/d855d6bf-853d-454b-b0b7-feb11f23cc17-kube-api-access-sc8ss\") pod \"designate-operator-controller-manager-588d4d986b-9gzr8\" (UID: \"d855d6bf-853d-454b-b0b7-feb11f23cc17\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.585711 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdvcc\" (UniqueName: \"kubernetes.io/projected/c29e0600-cf39-40bf-9225-48e55c4b8f97-kube-api-access-kdvcc\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.585750 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8qhl\" (UniqueName: \"kubernetes.io/projected/8582a4fb-51b2-411c-a67f-31a023f40493-kube-api-access-b8qhl\") pod \"glance-operator-controller-manager-79df6bcc97-dn24h\" (UID: \"8582a4fb-51b2-411c-a67f-31a023f40493\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.585771 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhb76\" (UniqueName: \"kubernetes.io/projected/3fb5bb3a-ab86-4c3f-9d2d-9ef6d7f7ca1f-kube-api-access-rhb76\") pod \"heat-operator-controller-manager-67dd5f86f5-r4sct\" (UID: \"3fb5bb3a-ab86-4c3f-9d2d-9ef6d7f7ca1f\") " 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.590573 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz4qn\" (UniqueName: \"kubernetes.io/projected/1ec4d02c-2709-4102-8a27-c4e7c71ed61f-kube-api-access-rz4qn\") pod \"cinder-operator-controller-manager-8d58dc466-stsk5\" (UID: \"1ec4d02c-2709-4102-8a27-c4e7c71ed61f\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.601914 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-blfgz"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.602769 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.610081 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nk4lb" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.637605 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.637740 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhb76\" (UniqueName: \"kubernetes.io/projected/3fb5bb3a-ab86-4c3f-9d2d-9ef6d7f7ca1f-kube-api-access-rhb76\") pod \"heat-operator-controller-manager-67dd5f86f5-r4sct\" (UID: \"3fb5bb3a-ab86-4c3f-9d2d-9ef6d7f7ca1f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.638599 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8qhl\" (UniqueName: \"kubernetes.io/projected/8582a4fb-51b2-411c-a67f-31a023f40493-kube-api-access-b8qhl\") pod \"glance-operator-controller-manager-79df6bcc97-dn24h\" (UID: \"8582a4fb-51b2-411c-a67f-31a023f40493\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.640077 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.671643 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc8ss\" (UniqueName: \"kubernetes.io/projected/d855d6bf-853d-454b-b0b7-feb11f23cc17-kube-api-access-sc8ss\") pod \"designate-operator-controller-manager-588d4d986b-9gzr8\" (UID: \"d855d6bf-853d-454b-b0b7-feb11f23cc17\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.691194 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.691492 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckfsv\" (UniqueName: \"kubernetes.io/projected/17f5527d-b31e-4788-ab09-ac5d26ea1bce-kube-api-access-ckfsv\") pod \"keystone-operator-controller-manager-768b96df4c-prgc2\" (UID: \"17f5527d-b31e-4788-ab09-ac5d26ea1bce\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.691664 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bvhk\" (UniqueName: \"kubernetes.io/projected/5d9f2386-33fc-43e9-9a61-e0d57fd94fbe-kube-api-access-9bvhk\") pod \"horizon-operator-controller-manager-8464cc45fb-9gkdc\" (UID: \"5d9f2386-33fc-43e9-9a61-e0d57fd94fbe\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.691732 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l77cr\" (UniqueName: \"kubernetes.io/projected/640d4c26-acbd-4cb4-8b59-fde206294a91-kube-api-access-l77cr\") pod \"ironic-operator-controller-manager-6f787dddc9-9zhhz\" (UID: \"640d4c26-acbd-4cb4-8b59-fde206294a91\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.691766 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdvcc\" (UniqueName: \"kubernetes.io/projected/c29e0600-cf39-40bf-9225-48e55c4b8f97-kube-api-access-kdvcc\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.691805 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s776c\" (UniqueName: \"kubernetes.io/projected/6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d-kube-api-access-s776c\") pod \"manila-operator-controller-manager-55f864c847-blfgz\" (UID: \"6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" Mar 20 10:52:24 crc kubenswrapper[4748]: E0320 10:52:24.692446 4748 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:24 crc kubenswrapper[4748]: E0320 10:52:24.692510 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert podName:c29e0600-cf39-40bf-9225-48e55c4b8f97 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:25.192476467 +0000 UTC m=+980.334022281 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert") pod "infra-operator-controller-manager-7b9c774f96-x7sjb" (UID: "c29e0600-cf39-40bf-9225-48e55c4b8f97") : secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.694540 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.703362 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-blfgz"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.715532 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdvcc\" (UniqueName: \"kubernetes.io/projected/c29e0600-cf39-40bf-9225-48e55c4b8f97-kube-api-access-kdvcc\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.717253 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l77cr\" (UniqueName: \"kubernetes.io/projected/640d4c26-acbd-4cb4-8b59-fde206294a91-kube-api-access-l77cr\") pod \"ironic-operator-controller-manager-6f787dddc9-9zhhz\" (UID: \"640d4c26-acbd-4cb4-8b59-fde206294a91\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.718473 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.719072 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bvhk\" (UniqueName: 
\"kubernetes.io/projected/5d9f2386-33fc-43e9-9a61-e0d57fd94fbe-kube-api-access-9bvhk\") pod \"horizon-operator-controller-manager-8464cc45fb-9gkdc\" (UID: \"5d9f2386-33fc-43e9-9a61-e0d57fd94fbe\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.720764 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.729537 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vvzcf" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.745953 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.750292 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.753651 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-fhfbq" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.756471 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.764103 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.778810 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.780046 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.786380 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mt8v4" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.793497 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqdpj\" (UniqueName: \"kubernetes.io/projected/b0d0b327-5826-4c41-84bc-8b2c2bb05756-kube-api-access-xqdpj\") pod \"mariadb-operator-controller-manager-67ccfc9778-r5z4d\" (UID: \"b0d0b327-5826-4c41-84bc-8b2c2bb05756\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.793621 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2wdg\" (UniqueName: \"kubernetes.io/projected/ecd87b49-65fa-465e-a668-03cb90381b6e-kube-api-access-q2wdg\") pod \"neutron-operator-controller-manager-767865f676-zcqnn\" (UID: \"ecd87b49-65fa-465e-a668-03cb90381b6e\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.793674 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s776c\" (UniqueName: \"kubernetes.io/projected/6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d-kube-api-access-s776c\") pod 
\"manila-operator-controller-manager-55f864c847-blfgz\" (UID: \"6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.793781 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckfsv\" (UniqueName: \"kubernetes.io/projected/17f5527d-b31e-4788-ab09-ac5d26ea1bce-kube-api-access-ckfsv\") pod \"keystone-operator-controller-manager-768b96df4c-prgc2\" (UID: \"17f5527d-b31e-4788-ab09-ac5d26ea1bce\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.796746 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.797611 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.798386 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.808730 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-l7vmc" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.820887 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.823717 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s776c\" (UniqueName: \"kubernetes.io/projected/6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d-kube-api-access-s776c\") pod \"manila-operator-controller-manager-55f864c847-blfgz\" (UID: \"6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.825657 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckfsv\" (UniqueName: \"kubernetes.io/projected/17f5527d-b31e-4788-ab09-ac5d26ea1bce-kube-api-access-ckfsv\") pod \"keystone-operator-controller-manager-768b96df4c-prgc2\" (UID: \"17f5527d-b31e-4788-ab09-ac5d26ea1bce\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.845675 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.894036 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.895372 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2wdg\" (UniqueName: 
\"kubernetes.io/projected/ecd87b49-65fa-465e-a668-03cb90381b6e-kube-api-access-q2wdg\") pod \"neutron-operator-controller-manager-767865f676-zcqnn\" (UID: \"ecd87b49-65fa-465e-a668-03cb90381b6e\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.895425 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkc8q\" (UniqueName: \"kubernetes.io/projected/bd4cdccf-68e3-4c27-ae51-f54b8089e08b-kube-api-access-fkc8q\") pod \"nova-operator-controller-manager-5d488d59fb-m84q7\" (UID: \"bd4cdccf-68e3-4c27-ae51-f54b8089e08b\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.895522 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqdpj\" (UniqueName: \"kubernetes.io/projected/b0d0b327-5826-4c41-84bc-8b2c2bb05756-kube-api-access-xqdpj\") pod \"mariadb-operator-controller-manager-67ccfc9778-r5z4d\" (UID: \"b0d0b327-5826-4c41-84bc-8b2c2bb05756\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.895546 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtn9p\" (UniqueName: \"kubernetes.io/projected/49092c30-9830-451a-8003-2cc7fa078b62-kube-api-access-vtn9p\") pod \"octavia-operator-controller-manager-5b9f45d989-cgzmd\" (UID: \"49092c30-9830-451a-8003-2cc7fa078b62\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.926354 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2wdg\" (UniqueName: \"kubernetes.io/projected/ecd87b49-65fa-465e-a668-03cb90381b6e-kube-api-access-q2wdg\") pod 
\"neutron-operator-controller-manager-767865f676-zcqnn\" (UID: \"ecd87b49-65fa-465e-a668-03cb90381b6e\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.932764 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqdpj\" (UniqueName: \"kubernetes.io/projected/b0d0b327-5826-4c41-84bc-8b2c2bb05756-kube-api-access-xqdpj\") pod \"mariadb-operator-controller-manager-67ccfc9778-r5z4d\" (UID: \"b0d0b327-5826-4c41-84bc-8b2c2bb05756\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.936405 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.945824 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.947199 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.952742 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4q9zf" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.953425 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.969139 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.976596 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn"] Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.977474 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.982127 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-kz6zq" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.997264 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.997313 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtn9p\" (UniqueName: \"kubernetes.io/projected/49092c30-9830-451a-8003-2cc7fa078b62-kube-api-access-vtn9p\") pod \"octavia-operator-controller-manager-5b9f45d989-cgzmd\" (UID: \"49092c30-9830-451a-8003-2cc7fa078b62\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.997357 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkc8q\" (UniqueName: \"kubernetes.io/projected/bd4cdccf-68e3-4c27-ae51-f54b8089e08b-kube-api-access-fkc8q\") pod \"nova-operator-controller-manager-5d488d59fb-m84q7\" (UID: 
\"bd4cdccf-68e3-4c27-ae51-f54b8089e08b\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7" Mar 20 10:52:24 crc kubenswrapper[4748]: I0320 10:52:24.997393 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9xpt\" (UniqueName: \"kubernetes.io/projected/f0a3f8d9-dcfa-498a-a46e-61628aa68067-kube-api-access-w9xpt\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.007734 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.014921 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.019652 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.019906 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-897gg"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.020730 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-897gg" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.035238 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-rpvk8" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.047108 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.048132 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.049050 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.049414 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtn9p\" (UniqueName: \"kubernetes.io/projected/49092c30-9830-451a-8003-2cc7fa078b62-kube-api-access-vtn9p\") pod \"octavia-operator-controller-manager-5b9f45d989-cgzmd\" (UID: \"49092c30-9830-451a-8003-2cc7fa078b62\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.052146 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-th9jr" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.078097 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.078549 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-897gg"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.079227 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.082878 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.094956 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.095716 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.098347 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz22c\" (UniqueName: \"kubernetes.io/projected/1988c5e6-a91c-4085-a878-2ffdf478fa1b-kube-api-access-gz22c\") pod \"swift-operator-controller-manager-c674c5965-2lkhq\" (UID: \"1988c5e6-a91c-4085-a878-2ffdf478fa1b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.098428 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dv5l\" (UniqueName: \"kubernetes.io/projected/bf9a7295-e355-4c61-a841-fd2bce675235-kube-api-access-7dv5l\") pod \"ovn-operator-controller-manager-884679f54-rt4hn\" (UID: \"bf9a7295-e355-4c61-a841-fd2bce675235\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.098495 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9xpt\" (UniqueName: \"kubernetes.io/projected/f0a3f8d9-dcfa-498a-a46e-61628aa68067-kube-api-access-w9xpt\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.098555 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbxkj\" (UniqueName: \"kubernetes.io/projected/736beaed-774c-43c0-bff9-d66a5ae4a1f5-kube-api-access-lbxkj\") pod \"placement-operator-controller-manager-5784578c99-897gg\" (UID: \"736beaed-774c-43c0-bff9-d66a5ae4a1f5\") " 
pod="openstack-operators/placement-operator-controller-manager-5784578c99-897gg" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.098580 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.098707 4748 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.098760 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert podName:f0a3f8d9-dcfa-498a-a46e-61628aa68067 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:25.598744354 +0000 UTC m=+980.740290168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rxplv" (UID: "f0a3f8d9-dcfa-498a-a46e-61628aa68067") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.104194 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-qhpgz" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.124202 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.124274 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.126661 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkc8q\" (UniqueName: \"kubernetes.io/projected/bd4cdccf-68e3-4c27-ae51-f54b8089e08b-kube-api-access-fkc8q\") pod \"nova-operator-controller-manager-5d488d59fb-m84q7\" (UID: \"bd4cdccf-68e3-4c27-ae51-f54b8089e08b\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.144180 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.166029 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.167116 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.169425 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-478rf" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.176379 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.180703 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9xpt\" (UniqueName: \"kubernetes.io/projected/f0a3f8d9-dcfa-498a-a46e-61628aa68067-kube-api-access-w9xpt\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.200552 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.200604 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbxkj\" (UniqueName: \"kubernetes.io/projected/736beaed-774c-43c0-bff9-d66a5ae4a1f5-kube-api-access-lbxkj\") pod \"placement-operator-controller-manager-5784578c99-897gg\" (UID: \"736beaed-774c-43c0-bff9-d66a5ae4a1f5\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-897gg" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.200638 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd85s\" (UniqueName: \"kubernetes.io/projected/44b5dd5f-9a81-4c01-8efd-6d4997bb9c94-kube-api-access-cd85s\") pod \"telemetry-operator-controller-manager-d6b694c5-9pmr7\" (UID: \"44b5dd5f-9a81-4c01-8efd-6d4997bb9c94\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.200672 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz22c\" (UniqueName: \"kubernetes.io/projected/1988c5e6-a91c-4085-a878-2ffdf478fa1b-kube-api-access-gz22c\") pod \"swift-operator-controller-manager-c674c5965-2lkhq\" (UID: \"1988c5e6-a91c-4085-a878-2ffdf478fa1b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.200705 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dv5l\" (UniqueName: \"kubernetes.io/projected/bf9a7295-e355-4c61-a841-fd2bce675235-kube-api-access-7dv5l\") pod \"ovn-operator-controller-manager-884679f54-rt4hn\" (UID: \"bf9a7295-e355-4c61-a841-fd2bce675235\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.200727 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khtgz\" (UniqueName: \"kubernetes.io/projected/8e47e918-33de-4a66-9223-7ee3264600c1-kube-api-access-khtgz\") pod \"test-operator-controller-manager-5c5cb9c4d7-vsd7j\" (UID: \"8e47e918-33de-4a66-9223-7ee3264600c1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.200915 4748 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:25 crc 
kubenswrapper[4748]: E0320 10:52:25.200961 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert podName:c29e0600-cf39-40bf-9225-48e55c4b8f97 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:26.200945294 +0000 UTC m=+981.342491108 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert") pod "infra-operator-controller-manager-7b9c774f96-x7sjb" (UID: "c29e0600-cf39-40bf-9225-48e55c4b8f97") : secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.234478 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.235590 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.241683 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbxkj\" (UniqueName: \"kubernetes.io/projected/736beaed-774c-43c0-bff9-d66a5ae4a1f5-kube-api-access-lbxkj\") pod \"placement-operator-controller-manager-5784578c99-897gg\" (UID: \"736beaed-774c-43c0-bff9-d66a5ae4a1f5\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-897gg" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.242137 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-qpb6t" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.252919 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz22c\" (UniqueName: \"kubernetes.io/projected/1988c5e6-a91c-4085-a878-2ffdf478fa1b-kube-api-access-gz22c\") pod 
\"swift-operator-controller-manager-c674c5965-2lkhq\" (UID: \"1988c5e6-a91c-4085-a878-2ffdf478fa1b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.254695 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.259642 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dv5l\" (UniqueName: \"kubernetes.io/projected/bf9a7295-e355-4c61-a841-fd2bce675235-kube-api-access-7dv5l\") pod \"ovn-operator-controller-manager-884679f54-rt4hn\" (UID: \"bf9a7295-e355-4c61-a841-fd2bce675235\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.268524 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.280316 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.283732 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.284164 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.286036 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-h2k75" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.296579 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.306028 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsh7f\" (UniqueName: \"kubernetes.io/projected/795ce1d0-2232-4ed7-8618-c47a7584973e-kube-api-access-rsh7f\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.306093 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxwd\" (UniqueName: \"kubernetes.io/projected/b6d05c98-f000-4560-b790-da31157488dc-kube-api-access-jwxwd\") pod \"watcher-operator-controller-manager-6c4d75f7f9-h7vwp\" (UID: \"b6d05c98-f000-4560-b790-da31157488dc\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.308606 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.309501 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd85s\" (UniqueName: \"kubernetes.io/projected/44b5dd5f-9a81-4c01-8efd-6d4997bb9c94-kube-api-access-cd85s\") pod \"telemetry-operator-controller-manager-d6b694c5-9pmr7\" (UID: \"44b5dd5f-9a81-4c01-8efd-6d4997bb9c94\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" Mar 20 
10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.309611 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.309759 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khtgz\" (UniqueName: \"kubernetes.io/projected/8e47e918-33de-4a66-9223-7ee3264600c1-kube-api-access-khtgz\") pod \"test-operator-controller-manager-5c5cb9c4d7-vsd7j\" (UID: \"8e47e918-33de-4a66-9223-7ee3264600c1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.314210 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.331590 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd85s\" (UniqueName: \"kubernetes.io/projected/44b5dd5f-9a81-4c01-8efd-6d4997bb9c94-kube-api-access-cd85s\") pod \"telemetry-operator-controller-manager-d6b694c5-9pmr7\" (UID: \"44b5dd5f-9a81-4c01-8efd-6d4997bb9c94\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.332361 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khtgz\" (UniqueName: \"kubernetes.io/projected/8e47e918-33de-4a66-9223-7ee3264600c1-kube-api-access-khtgz\") pod \"test-operator-controller-manager-5c5cb9c4d7-vsd7j\" (UID: \"8e47e918-33de-4a66-9223-7ee3264600c1\") " 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.359782 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.372349 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-897gg" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.381714 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.382605 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.383732 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.385219 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-wj9hn" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.394047 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.410590 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.410625 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.410713 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdjd\" (UniqueName: \"kubernetes.io/projected/b0646c53-71d5-40d9-8a3b-77c244fff7c4-kube-api-access-jqdjd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7tgsh\" (UID: \"b0646c53-71d5-40d9-8a3b-77c244fff7c4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.410738 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rsh7f\" (UniqueName: \"kubernetes.io/projected/795ce1d0-2232-4ed7-8618-c47a7584973e-kube-api-access-rsh7f\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.410760 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxwd\" (UniqueName: \"kubernetes.io/projected/b6d05c98-f000-4560-b790-da31157488dc-kube-api-access-jwxwd\") pod \"watcher-operator-controller-manager-6c4d75f7f9-h7vwp\" (UID: \"b6d05c98-f000-4560-b790-da31157488dc\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.411467 4748 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.411507 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:25.911492078 +0000 UTC m=+981.053037892 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "webhook-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.411658 4748 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.411679 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:25.911672662 +0000 UTC m=+981.053218476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "metrics-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.412698 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.435822 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwxwd\" (UniqueName: \"kubernetes.io/projected/b6d05c98-f000-4560-b790-da31157488dc-kube-api-access-jwxwd\") pod \"watcher-operator-controller-manager-6c4d75f7f9-h7vwp\" (UID: \"b6d05c98-f000-4560-b790-da31157488dc\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.454615 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsh7f\" (UniqueName: \"kubernetes.io/projected/795ce1d0-2232-4ed7-8618-c47a7584973e-kube-api-access-rsh7f\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.507788 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5"] Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.513350 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdjd\" (UniqueName: \"kubernetes.io/projected/b0646c53-71d5-40d9-8a3b-77c244fff7c4-kube-api-access-jqdjd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7tgsh\" (UID: \"b0646c53-71d5-40d9-8a3b-77c244fff7c4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.535934 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdjd\" (UniqueName: \"kubernetes.io/projected/b0646c53-71d5-40d9-8a3b-77c244fff7c4-kube-api-access-jqdjd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7tgsh\" (UID: 
\"b0646c53-71d5-40d9-8a3b-77c244fff7c4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.616015 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.616213 4748 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.616275 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert podName:f0a3f8d9-dcfa-498a-a46e-61628aa68067 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:26.616256116 +0000 UTC m=+981.757801930 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rxplv" (UID: "f0a3f8d9-dcfa-498a-a46e-61628aa68067") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.723485 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.758411 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.922228 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:25 crc kubenswrapper[4748]: I0320 10:52:25.922696 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.922434 4748 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.923005 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:26.922985919 +0000 UTC m=+982.064531733 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "webhook-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.922938 4748 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 10:52:25 crc kubenswrapper[4748]: E0320 10:52:25.923464 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:26.923451061 +0000 UTC m=+982.064996875 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "metrics-server-cert" not found Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.025097 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2"] Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.033253 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h"] Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.149508 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct"] Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.159980 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2"] Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.164071 
4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5" event={"ID":"1ec4d02c-2709-4102-8a27-c4e7c71ed61f","Type":"ContainerStarted","Data":"363ef476dffcb9e52d0202651554c94b7e11b6f4043c25c89d4c57d26ee52508"} Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.166422 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2" event={"ID":"20023868-c089-41ec-ac26-9b4882fbab50","Type":"ContainerStarted","Data":"e935a985299924ad7bfe9a012b8918bd5d3e2fe81847b8215868ff061462aec7"} Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.169040 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h" event={"ID":"8582a4fb-51b2-411c-a67f-31a023f40493","Type":"ContainerStarted","Data":"5fd0feb6c19a004243e27022f660dc710cef83fcc5e00e821312d1e12e22113d"} Mar 20 10:52:26 crc kubenswrapper[4748]: W0320 10:52:26.172282 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d9f2386_33fc_43e9_9a61_e0d57fd94fbe.slice/crio-d56044da7d386eb515db2ee01faa8d75dd12c4d0da1b095a01bfac7cba09d0d4 WatchSource:0}: Error finding container d56044da7d386eb515db2ee01faa8d75dd12c4d0da1b095a01bfac7cba09d0d4: Status 404 returned error can't find the container with id d56044da7d386eb515db2ee01faa8d75dd12c4d0da1b095a01bfac7cba09d0d4 Mar 20 10:52:26 crc kubenswrapper[4748]: W0320 10:52:26.174772 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f5527d_b31e_4788_ab09_ac5d26ea1bce.slice/crio-0e8927acbb7d0b0e64353568015f5bc838f4b51ff3a2fddc48f5c02464b3189c WatchSource:0}: Error finding container 0e8927acbb7d0b0e64353568015f5bc838f4b51ff3a2fddc48f5c02464b3189c: Status 404 returned error can't find the container with id 
0e8927acbb7d0b0e64353568015f5bc838f4b51ff3a2fddc48f5c02464b3189c Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.176644 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc"] Mar 20 10:52:26 crc kubenswrapper[4748]: W0320 10:52:26.178960 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd4cdccf_68e3_4c27_ae51_f54b8089e08b.slice/crio-8632c3606e4cac250b426352091d463f87c01b1f6891daa3905233f946a1ee5e WatchSource:0}: Error finding container 8632c3606e4cac250b426352091d463f87c01b1f6891daa3905233f946a1ee5e: Status 404 returned error can't find the container with id 8632c3606e4cac250b426352091d463f87c01b1f6891daa3905233f946a1ee5e Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.185140 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7"] Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.226524 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.227335 4748 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.227504 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert podName:c29e0600-cf39-40bf-9225-48e55c4b8f97 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:28.227474767 +0000 UTC m=+983.369020581 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert") pod "infra-operator-controller-manager-7b9c774f96-x7sjb" (UID: "c29e0600-cf39-40bf-9225-48e55c4b8f97") : secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.305359 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn"] Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.324190 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn"] Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.331146 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz"] Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.336796 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d"] Mar 20 10:52:26 crc kubenswrapper[4748]: W0320 10:52:26.337734 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod640d4c26_acbd_4cb4_8b59_fde206294a91.slice/crio-9258a05c8dab199a357abb2979732d841a52ad4a861a36c0b7db24990d731d4e WatchSource:0}: Error finding container 9258a05c8dab199a357abb2979732d841a52ad4a861a36c0b7db24990d731d4e: Status 404 returned error can't find the container with id 9258a05c8dab199a357abb2979732d841a52ad4a861a36c0b7db24990d731d4e Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.342846 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-blfgz"] Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.347637 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s776c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-blfgz_openstack-operators(6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.348289 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd"] Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.349062 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" podUID="6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d" Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.477944 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j"] Mar 20 10:52:26 crc kubenswrapper[4748]: W0320 10:52:26.487927 4748 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e47e918_33de_4a66_9223_7ee3264600c1.slice/crio-67ee4045cce3359c1dbb8014fe60ef8920e45ebdfb2753b0cefd80b964b7204c WatchSource:0}: Error finding container 67ee4045cce3359c1dbb8014fe60ef8920e45ebdfb2753b0cefd80b964b7204c: Status 404 returned error can't find the container with id 67ee4045cce3359c1dbb8014fe60ef8920e45ebdfb2753b0cefd80b964b7204c Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.510260 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-897gg"] Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.535922 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq"] Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.553578 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-khtgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-vsd7j_openstack-operators(8e47e918-33de-4a66-9223-7ee3264600c1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.553664 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7"] Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.553779 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gz22c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-2lkhq_openstack-operators(1988c5e6-a91c-4085-a878-2ffdf478fa1b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.554758 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" podUID="8e47e918-33de-4a66-9223-7ee3264600c1" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.554862 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" podUID="1988c5e6-a91c-4085-a878-2ffdf478fa1b" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.559758 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cd85s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-9pmr7_openstack-operators(44b5dd5f-9a81-4c01-8efd-6d4997bb9c94): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.561125 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" podUID="44b5dd5f-9a81-4c01-8efd-6d4997bb9c94" Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.565490 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8"] Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.573194 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sc8ss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-9gzr8_openstack-operators(d855d6bf-853d-454b-b0b7-feb11f23cc17): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.574561 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" podUID="d855d6bf-853d-454b-b0b7-feb11f23cc17" Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.574873 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh"] Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.582592 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp"] Mar 20 10:52:26 crc kubenswrapper[4748]: W0320 10:52:26.587705 4748 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0646c53_71d5_40d9_8a3b_77c244fff7c4.slice/crio-3a316be460432e520908ebeb88a334d29fb646fdd1b89427ec8d474bf4b18c8a WatchSource:0}: Error finding container 3a316be460432e520908ebeb88a334d29fb646fdd1b89427ec8d474bf4b18c8a: Status 404 returned error can't find the container with id 3a316be460432e520908ebeb88a334d29fb646fdd1b89427ec8d474bf4b18c8a Mar 20 10:52:26 crc kubenswrapper[4748]: W0320 10:52:26.589153 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6d05c98_f000_4560_b790_da31157488dc.slice/crio-b9b06a72e366bbbd6f08797ff49fa083511da851a89544efaaac62f3bea414f2 WatchSource:0}: Error finding container b9b06a72e366bbbd6f08797ff49fa083511da851a89544efaaac62f3bea414f2: Status 404 returned error can't find the container with id b9b06a72e366bbbd6f08797ff49fa083511da851a89544efaaac62f3bea414f2 Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.595253 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jwxwd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-h7vwp_openstack-operators(b6d05c98-f000-4560-b790-da31157488dc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.595926 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqdjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7tgsh_openstack-operators(b0646c53-71d5-40d9-8a3b-77c244fff7c4): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.596678 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" podUID="b6d05c98-f000-4560-b790-da31157488dc" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.598344 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" podUID="b0646c53-71d5-40d9-8a3b-77c244fff7c4" Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.633855 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.634115 4748 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.634176 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert podName:f0a3f8d9-dcfa-498a-a46e-61628aa68067 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:28.634157674 +0000 UTC m=+983.775703498 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rxplv" (UID: "f0a3f8d9-dcfa-498a-a46e-61628aa68067") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.938287 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:26 crc kubenswrapper[4748]: I0320 10:52:26.938784 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.938585 4748 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.938965 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:28.938938719 +0000 UTC m=+984.080484533 (durationBeforeRetry 2s). 
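The "pull QPS exceeded" ErrImagePull entries above are the kubelet's client-side image-pull rate limiter tripping while roughly twenty operator images pull at once: `registryPullQPS` defaults to 5 pulls/second with a `registryBurst` of 10. One way out is raising those limits in the kubelet configuration; the fragment below is a sketch of the relevant `KubeletConfiguration` fields (on OpenShift these would normally be applied through a `KubeletConfig` custom resource rather than edited directly).

```shell
# "pull QPS exceeded" comes from the kubelet's client-side pull limiter
# (registryPullQPS, default 5; registryBurst, default 10). This writes a
# KubeletConfiguration fragment that relaxes it; values are illustrative.
cat > /tmp/kubelet-pull-qps.yaml <<'EOF'
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
registryPullQPS: 10          # image pulls per second (0 disables the limit)
registryBurst: 20            # short burst allowance above the QPS rate
serializeImagePulls: false   # allow parallel pulls instead of one at a time
EOF
grep -c 'registryPullQPS' /tmp/kubelet-pull-qps.yaml   # prints 1
```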
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "webhook-server-cert" not found Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.939030 4748 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 10:52:26 crc kubenswrapper[4748]: E0320 10:52:26.939107 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:28.939088023 +0000 UTC m=+984.080633907 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "metrics-server-cert" not found Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.182553 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd" event={"ID":"49092c30-9830-451a-8003-2cc7fa078b62","Type":"ContainerStarted","Data":"5570f4459a3e751d1d9a108c02da69c7391c4d3eff76ef9742189dbcbf389040"} Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.191344 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2" event={"ID":"17f5527d-b31e-4788-ab09-ac5d26ea1bce","Type":"ContainerStarted","Data":"0e8927acbb7d0b0e64353568015f5bc838f4b51ff3a2fddc48f5c02464b3189c"} Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.200671 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc" event={"ID":"5d9f2386-33fc-43e9-9a61-e0d57fd94fbe","Type":"ContainerStarted","Data":"d56044da7d386eb515db2ee01faa8d75dd12c4d0da1b095a01bfac7cba09d0d4"} Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.228912 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" event={"ID":"8e47e918-33de-4a66-9223-7ee3264600c1","Type":"ContainerStarted","Data":"67ee4045cce3359c1dbb8014fe60ef8920e45ebdfb2753b0cefd80b964b7204c"} Mar 20 10:52:27 crc kubenswrapper[4748]: E0320 10:52:27.232297 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" podUID="8e47e918-33de-4a66-9223-7ee3264600c1" Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.232379 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d" event={"ID":"b0d0b327-5826-4c41-84bc-8b2c2bb05756","Type":"ContainerStarted","Data":"56502f96fae99b45ea451f1ca329e945bfd4eb56d4fbe673180e7ac81f0e7d4c"} Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.237054 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" event={"ID":"44b5dd5f-9a81-4c01-8efd-6d4997bb9c94","Type":"ContainerStarted","Data":"1e3bfdbe46883d63dff0d3d5866f6aff3791e18b75325100b7d0d226887ed7dd"} Mar 20 10:52:27 crc kubenswrapper[4748]: E0320 10:52:27.239533 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" podUID="44b5dd5f-9a81-4c01-8efd-6d4997bb9c94" Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.241768 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7" event={"ID":"bd4cdccf-68e3-4c27-ae51-f54b8089e08b","Type":"ContainerStarted","Data":"8632c3606e4cac250b426352091d463f87c01b1f6891daa3905233f946a1ee5e"} Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.269245 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn" event={"ID":"bf9a7295-e355-4c61-a841-fd2bce675235","Type":"ContainerStarted","Data":"341d21b3c360865e7c5a7e5e3aca333bf85793d0a217aec851d5c341aab643d0"} Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.273707 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" event={"ID":"b6d05c98-f000-4560-b790-da31157488dc","Type":"ContainerStarted","Data":"b9b06a72e366bbbd6f08797ff49fa083511da851a89544efaaac62f3bea414f2"} Mar 20 10:52:27 crc kubenswrapper[4748]: E0320 10:52:27.275238 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" podUID="b6d05c98-f000-4560-b790-da31157488dc" Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.276576 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" 
event={"ID":"6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d","Type":"ContainerStarted","Data":"027f61b17ba1a9f028ca4f1e9f1f6619e65b12f2f1fffea2e68b9a74a7abb04d"} Mar 20 10:52:27 crc kubenswrapper[4748]: E0320 10:52:27.278248 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" podUID="6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d" Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.281704 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn" event={"ID":"ecd87b49-65fa-465e-a668-03cb90381b6e","Type":"ContainerStarted","Data":"46991a332d70ee9e310ff6357e3e6fd54e109d379b10f3e8a22a2a54d67a0cf5"} Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.290454 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" event={"ID":"b0646c53-71d5-40d9-8a3b-77c244fff7c4","Type":"ContainerStarted","Data":"3a316be460432e520908ebeb88a334d29fb646fdd1b89427ec8d474bf4b18c8a"} Mar 20 10:52:27 crc kubenswrapper[4748]: E0320 10:52:27.293288 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" podUID="b0646c53-71d5-40d9-8a3b-77c244fff7c4" Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.309208 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct" 
event={"ID":"3fb5bb3a-ab86-4c3f-9d2d-9ef6d7f7ca1f","Type":"ContainerStarted","Data":"4c98498158e5a9e32c243083e501d1e40c1787bd19cbb46f112bdadf37b0e6b9"} Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.328074 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" event={"ID":"d855d6bf-853d-454b-b0b7-feb11f23cc17","Type":"ContainerStarted","Data":"d1b1b9392f77a868230adb3fd4bb80b9bb61e977ed26061be8165b412d00a1cb"} Mar 20 10:52:27 crc kubenswrapper[4748]: E0320 10:52:27.330002 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" podUID="d855d6bf-853d-454b-b0b7-feb11f23cc17" Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.353213 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" event={"ID":"1988c5e6-a91c-4085-a878-2ffdf478fa1b","Type":"ContainerStarted","Data":"b2fd8ab7bb0fb8aafe241e31a6872b0d93a7c32248f5c6f5c1cb6b7679064c08"} Mar 20 10:52:27 crc kubenswrapper[4748]: E0320 10:52:27.354675 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" podUID="1988c5e6-a91c-4085-a878-2ffdf478fa1b" Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.364233 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-897gg" 
event={"ID":"736beaed-774c-43c0-bff9-d66a5ae4a1f5","Type":"ContainerStarted","Data":"79c6dbdbd4cf715b2b2b97021f8bb755de492a9ca3584ab042bf92b3234f24c0"} Mar 20 10:52:27 crc kubenswrapper[4748]: I0320 10:52:27.368318 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz" event={"ID":"640d4c26-acbd-4cb4-8b59-fde206294a91","Type":"ContainerStarted","Data":"9258a05c8dab199a357abb2979732d841a52ad4a861a36c0b7db24990d731d4e"} Mar 20 10:52:28 crc kubenswrapper[4748]: I0320 10:52:28.266702 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.266903 4748 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.266993 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert podName:c29e0600-cf39-40bf-9225-48e55c4b8f97 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:32.266970956 +0000 UTC m=+987.408516770 (durationBeforeRetry 4s). 
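The MountVolume.SetUp failures above all have the same cause: the webhook/metrics cert Secrets the operator Deployments reference (`webhook-server-cert`, `metrics-server-cert`, `infra-operator-webhook-server-cert`, `openstack-baremetal-operator-webhook-server-cert`) do not exist yet in `openstack-operators`. Normally cert-manager or the operator bundle creates them; the sketch below only shows the shape of the `kubernetes.io/tls` Secret the kubelet is waiting for, with placeholder payloads, plus the command one would use to inspect it on a live cluster.

```shell
# The kubelet keeps retrying the mount because this Secret is missing.
# On a live cluster, check with:
#   oc -n openstack-operators get secret webhook-server-cert
# The base64 payloads below are placeholders, NOT real key material.
cat > /tmp/webhook-server-cert.yaml <<'EOF'
apiVersion: v1
kind: Secret
metadata:
  name: webhook-server-cert        # name taken from the log above
  namespace: openstack-operators
type: kubernetes.io/tls
data:
  tls.crt: UExBQ0VIT0xERVI=        # placeholder ("PLACEHOLDER"), not a cert
  tls.key: UExBQ0VIT0xERVI=        # placeholder, not a key
EOF
grep -c 'kubernetes.io/tls' /tmp/webhook-server-cert.yaml   # prints 1
```

Creating such a Secret by hand is only a stopgap; the durable fix is whatever component (cert-manager, the operator's own bootstrap) is supposed to issue these certs.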
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert") pod "infra-operator-controller-manager-7b9c774f96-x7sjb" (UID: "c29e0600-cf39-40bf-9225-48e55c4b8f97") : secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.378376 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" podUID="d855d6bf-853d-454b-b0b7-feb11f23cc17" Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.380262 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" podUID="1988c5e6-a91c-4085-a878-2ffdf478fa1b" Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.380523 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" podUID="44b5dd5f-9a81-4c01-8efd-6d4997bb9c94" Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.380613 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" podUID="8e47e918-33de-4a66-9223-7ee3264600c1" Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.380636 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" podUID="6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d" Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.381132 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" podUID="b0646c53-71d5-40d9-8a3b-77c244fff7c4" Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.381146 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" podUID="b6d05c98-f000-4560-b790-da31157488dc" Mar 20 10:52:28 crc kubenswrapper[4748]: I0320 10:52:28.675914 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: 
\"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.676108 4748 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.676194 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert podName:f0a3f8d9-dcfa-498a-a46e-61628aa68067 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:32.676172716 +0000 UTC m=+987.817718530 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rxplv" (UID: "f0a3f8d9-dcfa-498a-a46e-61628aa68067") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:28 crc kubenswrapper[4748]: I0320 10:52:28.979245 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:28 crc kubenswrapper[4748]: I0320 10:52:28.979296 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 
10:52:28.979444 4748 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.979502 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:32.979485214 +0000 UTC m=+988.121031028 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "metrics-server-cert" not found Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.979515 4748 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 10:52:28 crc kubenswrapper[4748]: E0320 10:52:28.979674 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:32.979543275 +0000 UTC m=+988.121089089 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "webhook-server-cert" not found Mar 20 10:52:32 crc kubenswrapper[4748]: I0320 10:52:32.332044 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:32 crc kubenswrapper[4748]: E0320 10:52:32.332866 4748 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:32 crc kubenswrapper[4748]: E0320 10:52:32.332937 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert podName:c29e0600-cf39-40bf-9225-48e55c4b8f97 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:40.332912336 +0000 UTC m=+995.474458150 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert") pod "infra-operator-controller-manager-7b9c774f96-x7sjb" (UID: "c29e0600-cf39-40bf-9225-48e55c4b8f97") : secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:32 crc kubenswrapper[4748]: I0320 10:52:32.739294 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:32 crc kubenswrapper[4748]: E0320 10:52:32.739539 4748 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:32 crc kubenswrapper[4748]: E0320 10:52:32.739680 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert podName:f0a3f8d9-dcfa-498a-a46e-61628aa68067 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:40.739646664 +0000 UTC m=+995.881192478 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rxplv" (UID: "f0a3f8d9-dcfa-498a-a46e-61628aa68067") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:33 crc kubenswrapper[4748]: I0320 10:52:33.044419 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:33 crc kubenswrapper[4748]: I0320 10:52:33.044479 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:33 crc kubenswrapper[4748]: E0320 10:52:33.044652 4748 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 10:52:33 crc kubenswrapper[4748]: E0320 10:52:33.044682 4748 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 10:52:33 crc kubenswrapper[4748]: E0320 10:52:33.044718 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:41.044698775 +0000 UTC m=+996.186244599 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "metrics-server-cert" not found Mar 20 10:52:33 crc kubenswrapper[4748]: E0320 10:52:33.044773 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:41.044749036 +0000 UTC m=+996.186294940 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "webhook-server-cert" not found Mar 20 10:52:38 crc kubenswrapper[4748]: E0320 10:52:38.094745 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 10:52:38 crc kubenswrapper[4748]: E0320 10:52:38.095602 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ckfsv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-prgc2_openstack-operators(17f5527d-b31e-4788-ab09-ac5d26ea1bce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:52:38 crc kubenswrapper[4748]: E0320 10:52:38.097853 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2" podUID="17f5527d-b31e-4788-ab09-ac5d26ea1bce" Mar 20 10:52:38 crc kubenswrapper[4748]: E0320 10:52:38.463189 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2" podUID="17f5527d-b31e-4788-ab09-ac5d26ea1bce" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.468256 4748 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz" event={"ID":"640d4c26-acbd-4cb4-8b59-fde206294a91","Type":"ContainerStarted","Data":"511abd5338cfe000dd7d7d55d8b3dda0d4f72bb7a94c085ea6f1993b4839c478"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.468725 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.470079 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd" event={"ID":"49092c30-9830-451a-8003-2cc7fa078b62","Type":"ContainerStarted","Data":"eb71dade9e5c984477769c0def314911acfe065995cb4e2039820f3fa548d82a"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.470166 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.471867 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc" event={"ID":"5d9f2386-33fc-43e9-9a61-e0d57fd94fbe","Type":"ContainerStarted","Data":"b7b5e23a48496d339427009fbabe1ed5061a19ce936a9d3e203ac98356da94d8"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.471956 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.473012 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct" event={"ID":"3fb5bb3a-ab86-4c3f-9d2d-9ef6d7f7ca1f","Type":"ContainerStarted","Data":"ea145941df3687b6f560ee6f8e8634761d1d341d367aa2663e4de52a00e99274"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.473647 4748 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.474722 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn" event={"ID":"bf9a7295-e355-4c61-a841-fd2bce675235","Type":"ContainerStarted","Data":"66cb6ab729c75b156f5715bb940804834ad59156a9c32adffe372804a911df39"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.475064 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.476740 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn" event={"ID":"ecd87b49-65fa-465e-a668-03cb90381b6e","Type":"ContainerStarted","Data":"554d6818e9ca85c1e450f01f651da05880414c5e83a1317c11076e04e20f38a5"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.476859 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.478206 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2" event={"ID":"20023868-c089-41ec-ac26-9b4882fbab50","Type":"ContainerStarted","Data":"ea94e3ca7745a9645d892093c86a5231fa53b0c262df37fdb592ad00ad125369"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.478343 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.480101 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-897gg" 
event={"ID":"736beaed-774c-43c0-bff9-d66a5ae4a1f5","Type":"ContainerStarted","Data":"2627b8e154d5b1edc2c7d8cb9e7bdaaafb700d464a080ef96ff31c688e102dea"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.480222 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-897gg" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.481641 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7" event={"ID":"bd4cdccf-68e3-4c27-ae51-f54b8089e08b","Type":"ContainerStarted","Data":"c09e6a54ff0321d13d529c72a6d121a56f06d50cbd217eba6ac95cf9538a2c23"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.481765 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.483087 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h" event={"ID":"8582a4fb-51b2-411c-a67f-31a023f40493","Type":"ContainerStarted","Data":"5954530a51e77bce079ae0b19287373c424ad258fb6fdae6aff588e28bc56118"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.483152 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.484550 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d" event={"ID":"b0d0b327-5826-4c41-84bc-8b2c2bb05756","Type":"ContainerStarted","Data":"fe506928c937970f3ce0b381581a1cc1b58cf655bc90dc13de18f8b5f0ee96ee"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.484747 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.486030 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5" event={"ID":"1ec4d02c-2709-4102-8a27-c4e7c71ed61f","Type":"ContainerStarted","Data":"0f2b93add5e6518b9208bdf16f58fbd905074a112621f6614042df2d312b124f"} Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.486208 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.505902 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz" podStartSLOduration=3.058115353 podStartE2EDuration="15.505880161s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.345481951 +0000 UTC m=+981.487027765" lastFinishedPulling="2026-03-20 10:52:38.793246769 +0000 UTC m=+993.934792573" observedRunningTime="2026-03-20 10:52:39.503223444 +0000 UTC m=+994.644769268" watchObservedRunningTime="2026-03-20 10:52:39.505880161 +0000 UTC m=+994.647425975" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.557198 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc" podStartSLOduration=2.938246282 podStartE2EDuration="15.557176941s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.175015368 +0000 UTC m=+981.316561182" lastFinishedPulling="2026-03-20 10:52:38.793946027 +0000 UTC m=+993.935491841" observedRunningTime="2026-03-20 10:52:39.55114321 +0000 UTC m=+994.692689024" watchObservedRunningTime="2026-03-20 10:52:39.557176941 +0000 UTC m=+994.698722745" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 
10:52:39.649296 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn" podStartSLOduration=3.173579893 podStartE2EDuration="15.649274018s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.317502553 +0000 UTC m=+981.459048367" lastFinishedPulling="2026-03-20 10:52:38.793196678 +0000 UTC m=+993.934742492" observedRunningTime="2026-03-20 10:52:39.635763182 +0000 UTC m=+994.777308996" watchObservedRunningTime="2026-03-20 10:52:39.649274018 +0000 UTC m=+994.790819832" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.649557 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d" podStartSLOduration=3.18986348 podStartE2EDuration="15.649551745s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.342691282 +0000 UTC m=+981.484237096" lastFinishedPulling="2026-03-20 10:52:38.802379547 +0000 UTC m=+993.943925361" observedRunningTime="2026-03-20 10:52:39.598494401 +0000 UTC m=+994.740040215" watchObservedRunningTime="2026-03-20 10:52:39.649551745 +0000 UTC m=+994.791097559" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.675493 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd" podStartSLOduration=3.211845598 podStartE2EDuration="15.675473752s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.340266331 +0000 UTC m=+981.481812145" lastFinishedPulling="2026-03-20 10:52:38.803894485 +0000 UTC m=+993.945440299" observedRunningTime="2026-03-20 10:52:39.671849862 +0000 UTC m=+994.813395686" watchObservedRunningTime="2026-03-20 10:52:39.675473752 +0000 UTC m=+994.817019566" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.689771 4748 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7" podStartSLOduration=2.95255889 podStartE2EDuration="15.689752809s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.184667799 +0000 UTC m=+981.326213613" lastFinishedPulling="2026-03-20 10:52:38.921861718 +0000 UTC m=+994.063407532" observedRunningTime="2026-03-20 10:52:39.687545324 +0000 UTC m=+994.829091138" watchObservedRunningTime="2026-03-20 10:52:39.689752809 +0000 UTC m=+994.831298623" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.709486 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h" podStartSLOduration=2.967321577 podStartE2EDuration="15.70945428s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.051580208 +0000 UTC m=+981.193126022" lastFinishedPulling="2026-03-20 10:52:38.793712911 +0000 UTC m=+993.935258725" observedRunningTime="2026-03-20 10:52:39.703968813 +0000 UTC m=+994.845514637" watchObservedRunningTime="2026-03-20 10:52:39.70945428 +0000 UTC m=+994.851000094" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.737232 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-897gg" podStartSLOduration=3.472387629 podStartE2EDuration="15.737212243s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.553360778 +0000 UTC m=+981.694906602" lastFinishedPulling="2026-03-20 10:52:38.818185402 +0000 UTC m=+993.959731216" observedRunningTime="2026-03-20 10:52:39.732437884 +0000 UTC m=+994.873983698" watchObservedRunningTime="2026-03-20 10:52:39.737212243 +0000 UTC m=+994.878758057" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.772769 4748 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2" podStartSLOduration=3.030542704 podStartE2EDuration="15.772750969s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.051787473 +0000 UTC m=+981.193333287" lastFinishedPulling="2026-03-20 10:52:38.793995748 +0000 UTC m=+993.935541552" observedRunningTime="2026-03-20 10:52:39.755220292 +0000 UTC m=+994.896766106" watchObservedRunningTime="2026-03-20 10:52:39.772750969 +0000 UTC m=+994.914296783" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.775448 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct" podStartSLOduration=3.146227981 podStartE2EDuration="15.775441527s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.164539246 +0000 UTC m=+981.306085060" lastFinishedPulling="2026-03-20 10:52:38.793752792 +0000 UTC m=+993.935298606" observedRunningTime="2026-03-20 10:52:39.770498093 +0000 UTC m=+994.912043917" watchObservedRunningTime="2026-03-20 10:52:39.775441527 +0000 UTC m=+994.916987341" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.798996 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn" podStartSLOduration=3.335221057 podStartE2EDuration="15.798974094s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.337073381 +0000 UTC m=+981.478619195" lastFinishedPulling="2026-03-20 10:52:38.800826418 +0000 UTC m=+993.942372232" observedRunningTime="2026-03-20 10:52:39.793655241 +0000 UTC m=+994.935201055" watchObservedRunningTime="2026-03-20 10:52:39.798974094 +0000 UTC m=+994.940519908" Mar 20 10:52:39 crc kubenswrapper[4748]: I0320 10:52:39.818881 4748 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5" podStartSLOduration=2.645531558 podStartE2EDuration="15.81885989s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:25.620069711 +0000 UTC m=+980.761615535" lastFinishedPulling="2026-03-20 10:52:38.793398053 +0000 UTC m=+993.934943867" observedRunningTime="2026-03-20 10:52:39.817039875 +0000 UTC m=+994.958585699" watchObservedRunningTime="2026-03-20 10:52:39.81885989 +0000 UTC m=+994.960405704" Mar 20 10:52:40 crc kubenswrapper[4748]: I0320 10:52:40.366700 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:40 crc kubenswrapper[4748]: E0320 10:52:40.366896 4748 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:40 crc kubenswrapper[4748]: E0320 10:52:40.367144 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert podName:c29e0600-cf39-40bf-9225-48e55c4b8f97 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:56.367127289 +0000 UTC m=+1011.508673103 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert") pod "infra-operator-controller-manager-7b9c774f96-x7sjb" (UID: "c29e0600-cf39-40bf-9225-48e55c4b8f97") : secret "infra-operator-webhook-server-cert" not found Mar 20 10:52:40 crc kubenswrapper[4748]: I0320 10:52:40.782098 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" Mar 20 10:52:40 crc kubenswrapper[4748]: E0320 10:52:40.782339 4748 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:40 crc kubenswrapper[4748]: E0320 10:52:40.784158 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert podName:f0a3f8d9-dcfa-498a-a46e-61628aa68067 nodeName:}" failed. No retries permitted until 2026-03-20 10:52:56.784129874 +0000 UTC m=+1011.925675688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rxplv" (UID: "f0a3f8d9-dcfa-498a-a46e-61628aa68067") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 10:52:41 crc kubenswrapper[4748]: I0320 10:52:41.088078 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:41 crc kubenswrapper[4748]: I0320 10:52:41.088133 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" Mar 20 10:52:41 crc kubenswrapper[4748]: E0320 10:52:41.088315 4748 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 10:52:41 crc kubenswrapper[4748]: E0320 10:52:41.088371 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:57.088355145 +0000 UTC m=+1012.229900959 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "metrics-server-cert" not found Mar 20 10:52:41 crc kubenswrapper[4748]: E0320 10:52:41.088820 4748 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 10:52:41 crc kubenswrapper[4748]: E0320 10:52:41.088949 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs podName:795ce1d0-2232-4ed7-8618-c47a7584973e nodeName:}" failed. No retries permitted until 2026-03-20 10:52:57.088920659 +0000 UTC m=+1012.230466473 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs") pod "openstack-operator-controller-manager-5647f98656-9pqdv" (UID: "795ce1d0-2232-4ed7-8618-c47a7584973e") : secret "webhook-server-cert" not found Mar 20 10:52:44 crc kubenswrapper[4748]: I0320 10:52:44.640861 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-stsk5" Mar 20 10:52:44 crc kubenswrapper[4748]: I0320 10:52:44.643499 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-4gsq2" Mar 20 10:52:44 crc kubenswrapper[4748]: I0320 10:52:44.702112 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dn24h" Mar 20 10:52:44 crc kubenswrapper[4748]: I0320 10:52:44.774514 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-r4sct" Mar 20 10:52:44 crc kubenswrapper[4748]: I0320 10:52:44.802065 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-9gkdc" Mar 20 10:52:44 crc kubenswrapper[4748]: I0320 10:52:44.943973 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9zhhz" Mar 20 10:52:45 crc kubenswrapper[4748]: I0320 10:52:45.081884 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-zcqnn" Mar 20 10:52:45 crc kubenswrapper[4748]: I0320 10:52:45.082590 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r5z4d" Mar 20 10:52:45 crc kubenswrapper[4748]: I0320 10:52:45.128213 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-cgzmd" Mar 20 10:52:45 crc kubenswrapper[4748]: I0320 10:52:45.149223 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-m84q7" Mar 20 10:52:45 crc kubenswrapper[4748]: I0320 10:52:45.299612 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-rt4hn" Mar 20 10:52:45 crc kubenswrapper[4748]: I0320 10:52:45.375484 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-897gg" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.595595 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" 
event={"ID":"1988c5e6-a91c-4085-a878-2ffdf478fa1b","Type":"ContainerStarted","Data":"2ad77dd3fc68661e8bdc3917637fd87ec65e391b5c6e287753b28c2fdcd0a419"} Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.596459 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.598065 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" event={"ID":"b0646c53-71d5-40d9-8a3b-77c244fff7c4","Type":"ContainerStarted","Data":"09f3b58a5f3d5e81ab7542e98b67a2b8c34e4483151692a3f8afeebd4126a673"} Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.600060 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" event={"ID":"44b5dd5f-9a81-4c01-8efd-6d4997bb9c94","Type":"ContainerStarted","Data":"05226ecfd296c813f0293a968cae4d5d16149e3d6994bc1fcb3f5f89476b5777"} Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.600395 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.602792 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" event={"ID":"8e47e918-33de-4a66-9223-7ee3264600c1","Type":"ContainerStarted","Data":"faf1c3d1e3d5884ac7d7f40d9ce926a094d9dfb6a14ae1dc59aeb9306b94b484"} Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.603139 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.605384 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" event={"ID":"b6d05c98-f000-4560-b790-da31157488dc","Type":"ContainerStarted","Data":"43b1a31facf57346ec8a14c2cb24a8097294add2d634bd9f6a7fbe129c7a78c0"} Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.605710 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.607559 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" event={"ID":"d855d6bf-853d-454b-b0b7-feb11f23cc17","Type":"ContainerStarted","Data":"bcfed1342dff98f72674efe758bfd185ec4e4da374e053e3ed4c629a7a93da21"} Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.607935 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.609359 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" event={"ID":"6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d","Type":"ContainerStarted","Data":"41ab5bf63264cb601d945292f8d84a9e1b0780a32e9f4d3a7c75165b5e275fa1"} Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.609631 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.626090 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" podStartSLOduration=3.8513941149999997 podStartE2EDuration="26.626073523s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.553708627 +0000 UTC m=+981.695254441" 
lastFinishedPulling="2026-03-20 10:52:49.328388025 +0000 UTC m=+1004.469933849" observedRunningTime="2026-03-20 10:52:50.619583891 +0000 UTC m=+1005.761129705" watchObservedRunningTime="2026-03-20 10:52:50.626073523 +0000 UTC m=+1005.767619337" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.640564 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7tgsh" podStartSLOduration=2.927301269 podStartE2EDuration="25.640547024s" podCreationTimestamp="2026-03-20 10:52:25 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.595163031 +0000 UTC m=+981.736708845" lastFinishedPulling="2026-03-20 10:52:49.308408786 +0000 UTC m=+1004.449954600" observedRunningTime="2026-03-20 10:52:50.636203476 +0000 UTC m=+1005.777749300" watchObservedRunningTime="2026-03-20 10:52:50.640547024 +0000 UTC m=+1005.782092838" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.663605 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" podStartSLOduration=3.909819073 podStartE2EDuration="26.663579469s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.573025719 +0000 UTC m=+981.714571533" lastFinishedPulling="2026-03-20 10:52:49.326786125 +0000 UTC m=+1004.468331929" observedRunningTime="2026-03-20 10:52:50.66243938 +0000 UTC m=+1005.803985214" watchObservedRunningTime="2026-03-20 10:52:50.663579469 +0000 UTC m=+1005.805125293" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.690722 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" podStartSLOduration=3.955592404 podStartE2EDuration="26.690696225s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.55344726 +0000 UTC m=+981.694993074" 
lastFinishedPulling="2026-03-20 10:52:49.288551071 +0000 UTC m=+1004.430096895" observedRunningTime="2026-03-20 10:52:50.685197938 +0000 UTC m=+1005.826743762" watchObservedRunningTime="2026-03-20 10:52:50.690696225 +0000 UTC m=+1005.832242049" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.710086 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" podStartSLOduration=3.996699871 podStartE2EDuration="26.710056959s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.595088179 +0000 UTC m=+981.736633993" lastFinishedPulling="2026-03-20 10:52:49.308445267 +0000 UTC m=+1004.449991081" observedRunningTime="2026-03-20 10:52:50.704324956 +0000 UTC m=+1005.845870770" watchObservedRunningTime="2026-03-20 10:52:50.710056959 +0000 UTC m=+1005.851602783" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.726056 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" podStartSLOduration=3.976448586 podStartE2EDuration="26.726032597s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.559633605 +0000 UTC m=+981.701179419" lastFinishedPulling="2026-03-20 10:52:49.309217616 +0000 UTC m=+1004.450763430" observedRunningTime="2026-03-20 10:52:50.7217315 +0000 UTC m=+1005.863277314" watchObservedRunningTime="2026-03-20 10:52:50.726032597 +0000 UTC m=+1005.867578431" Mar 20 10:52:50 crc kubenswrapper[4748]: I0320 10:52:50.738693 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" podStartSLOduration=3.795789798 podStartE2EDuration="26.738678013s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.346502217 +0000 UTC m=+981.488048031" 
lastFinishedPulling="2026-03-20 10:52:49.289390432 +0000 UTC m=+1004.430936246" observedRunningTime="2026-03-20 10:52:50.738334644 +0000 UTC m=+1005.879880458" watchObservedRunningTime="2026-03-20 10:52:50.738678013 +0000 UTC m=+1005.880223827" Mar 20 10:52:54 crc kubenswrapper[4748]: I0320 10:52:54.973250 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9gzr8" Mar 20 10:52:55 crc kubenswrapper[4748]: I0320 10:52:55.053041 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-blfgz" Mar 20 10:52:55 crc kubenswrapper[4748]: I0320 10:52:55.362475 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-2lkhq" Mar 20 10:52:55 crc kubenswrapper[4748]: I0320 10:52:55.388768 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9pmr7" Mar 20 10:52:55 crc kubenswrapper[4748]: I0320 10:52:55.415485 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-vsd7j" Mar 20 10:52:55 crc kubenswrapper[4748]: I0320 10:52:55.727037 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h7vwp" Mar 20 10:52:56 crc kubenswrapper[4748]: I0320 10:52:56.427132 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" Mar 20 10:52:56 crc kubenswrapper[4748]: 
I0320 10:52:56.432518 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c29e0600-cf39-40bf-9225-48e55c4b8f97-cert\") pod \"infra-operator-controller-manager-7b9c774f96-x7sjb\" (UID: \"c29e0600-cf39-40bf-9225-48e55c4b8f97\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb"
Mar 20 10:52:56 crc kubenswrapper[4748]: I0320 10:52:56.634429 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb"
Mar 20 10:52:56 crc kubenswrapper[4748]: I0320 10:52:56.832947 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv"
Mar 20 10:52:56 crc kubenswrapper[4748]: I0320 10:52:56.838734 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0a3f8d9-dcfa-498a-a46e-61628aa68067-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rxplv\" (UID: \"f0a3f8d9-dcfa-498a-a46e-61628aa68067\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv"
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.037316 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb"]
Mar 20 10:52:57 crc kubenswrapper[4748]: W0320 10:52:57.038660 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc29e0600_cf39_40bf_9225_48e55c4b8f97.slice/crio-5eb942d93110e934a2ddc0fc22085a60bcc34f8e3cd7f5ebb7b4913f4351d01d WatchSource:0}: Error finding container 5eb942d93110e934a2ddc0fc22085a60bcc34f8e3cd7f5ebb7b4913f4351d01d: Status 404 returned error can't find the container with id 5eb942d93110e934a2ddc0fc22085a60bcc34f8e3cd7f5ebb7b4913f4351d01d
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.070188 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv"
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.138360 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv"
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.139158 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv"
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.143278 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-webhook-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv"
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.146465 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/795ce1d0-2232-4ed7-8618-c47a7584973e-metrics-certs\") pod \"openstack-operator-controller-manager-5647f98656-9pqdv\" (UID: \"795ce1d0-2232-4ed7-8618-c47a7584973e\") " pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv"
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.246150 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv"
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.452915 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv"]
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.485260 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv"]
Mar 20 10:52:57 crc kubenswrapper[4748]: W0320 10:52:57.489735 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0a3f8d9_dcfa_498a_a46e_61628aa68067.slice/crio-6168c285fd437f977facfd49a604f173a77755762bdc992a304512750b97ba6d WatchSource:0}: Error finding container 6168c285fd437f977facfd49a604f173a77755762bdc992a304512750b97ba6d: Status 404 returned error can't find the container with id 6168c285fd437f977facfd49a604f173a77755762bdc992a304512750b97ba6d
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.664877 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" event={"ID":"f0a3f8d9-dcfa-498a-a46e-61628aa68067","Type":"ContainerStarted","Data":"6168c285fd437f977facfd49a604f173a77755762bdc992a304512750b97ba6d"}
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.666813 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" event={"ID":"795ce1d0-2232-4ed7-8618-c47a7584973e","Type":"ContainerStarted","Data":"fdb164a335cfbf4888b2f1f1fd80f9348b28ee814b880d967ee70c0198995419"}
Mar 20 10:52:57 crc kubenswrapper[4748]: I0320 10:52:57.668255 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" event={"ID":"c29e0600-cf39-40bf-9225-48e55c4b8f97","Type":"ContainerStarted","Data":"5eb942d93110e934a2ddc0fc22085a60bcc34f8e3cd7f5ebb7b4913f4351d01d"}
Mar 20 10:53:15 crc kubenswrapper[4748]: I0320 10:53:15.816036 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" event={"ID":"795ce1d0-2232-4ed7-8618-c47a7584973e","Type":"ContainerStarted","Data":"c8186cf83064917da21228356e2ad120a9b771c52311d48360d04e53cbc75edb"}
Mar 20 10:53:16 crc kubenswrapper[4748]: I0320 10:53:16.824213 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv"
Mar 20 10:53:16 crc kubenswrapper[4748]: I0320 10:53:16.864540 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv" podStartSLOduration=52.864486718 podStartE2EDuration="52.864486718s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:53:16.85133467 +0000 UTC m=+1031.992880494" watchObservedRunningTime="2026-03-20 10:53:16.864486718 +0000 UTC m=+1032.006032552"
Mar 20 10:53:16 crc kubenswrapper[4748]: E0320 10:53:16.970936 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:a4cb438fef247332815b032c8a248bc65b873274aaac92478a22aa2f915798db"
Mar 20 10:53:16 crc kubenswrapper[4748]: E0320 10:53:16.971177 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:a4cb438fef247332815b032c8a248bc65b873274aaac92478a22aa2f915798db,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kdvcc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7b9c774f96-x7sjb_openstack-operators(c29e0600-cf39-40bf-9225-48e55c4b8f97): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 10:53:16 crc kubenswrapper[4748]: E0320 10:53:16.972930 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" podUID="c29e0600-cf39-40bf-9225-48e55c4b8f97"
Mar 20 10:53:21 crc kubenswrapper[4748]: E0320 10:53:21.625097 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:a4cb438fef247332815b032c8a248bc65b873274aaac92478a22aa2f915798db\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" podUID="c29e0600-cf39-40bf-9225-48e55c4b8f97"
Mar 20 10:53:23 crc kubenswrapper[4748]: I0320 10:53:23.870801 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2" event={"ID":"17f5527d-b31e-4788-ab09-ac5d26ea1bce","Type":"ContainerStarted","Data":"40932481bcef049a2f2c78ded18b9a43c7022a87e800564182f9ad07a33b960f"}
Mar 20 10:53:23 crc kubenswrapper[4748]: I0320 10:53:23.871418 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2"
Mar 20 10:53:23 crc kubenswrapper[4748]: I0320 10:53:23.872454 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" event={"ID":"f0a3f8d9-dcfa-498a-a46e-61628aa68067","Type":"ContainerStarted","Data":"b014200b8ee128f7653dd32cca2d05d7e71a95592f3cbbd7f4f8140250c53752"}
Mar 20 10:53:23 crc kubenswrapper[4748]: I0320 10:53:23.872609 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv"
Mar 20 10:53:23 crc kubenswrapper[4748]: I0320 10:53:23.898057 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2" podStartSLOduration=2.587036238 podStartE2EDuration="59.898022253s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:26.180968066 +0000 UTC m=+981.322513880" lastFinishedPulling="2026-03-20 10:53:23.491954091 +0000 UTC m=+1038.633499895" observedRunningTime="2026-03-20 10:53:23.886504886 +0000 UTC m=+1039.028050700" watchObservedRunningTime="2026-03-20 10:53:23.898022253 +0000 UTC m=+1039.039568077"
Mar 20 10:53:23 crc kubenswrapper[4748]: I0320 10:53:23.931909 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv" podStartSLOduration=33.912097538 podStartE2EDuration="59.931884828s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:57.492499489 +0000 UTC m=+1012.634045313" lastFinishedPulling="2026-03-20 10:53:23.512286749 +0000 UTC m=+1038.653832603" observedRunningTime="2026-03-20 10:53:23.92352083 +0000 UTC m=+1039.065066654" watchObservedRunningTime="2026-03-20 10:53:23.931884828 +0000 UTC m=+1039.073430642"
Mar 20 10:53:27 crc kubenswrapper[4748]: I0320 10:53:27.254697 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5647f98656-9pqdv"
Mar 20 10:53:35 crc kubenswrapper[4748]: I0320 10:53:35.023332 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-prgc2"
Mar 20 10:53:35 crc kubenswrapper[4748]: I0320 10:53:35.976528 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" event={"ID":"c29e0600-cf39-40bf-9225-48e55c4b8f97","Type":"ContainerStarted","Data":"9606c7783b5b0960b8d38fbfe528d523b024cc8db98321050c662eb243acbd5f"}
Mar 20 10:53:35 crc kubenswrapper[4748]: I0320 10:53:35.977048 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb"
Mar 20 10:53:36 crc kubenswrapper[4748]: I0320 10:53:36.003308 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb" podStartSLOduration=34.054778588 podStartE2EDuration="1m12.003275695s" podCreationTimestamp="2026-03-20 10:52:24 +0000 UTC" firstStartedPulling="2026-03-20 10:52:57.042120081 +0000 UTC m=+1012.183665895" lastFinishedPulling="2026-03-20 10:53:34.990617178 +0000 UTC m=+1050.132163002" observedRunningTime="2026-03-20 10:53:35.997936512 +0000 UTC m=+1051.139482316" watchObservedRunningTime="2026-03-20 10:53:36.003275695 +0000 UTC m=+1051.144821519"
Mar 20 10:53:37 crc kubenswrapper[4748]: I0320 10:53:37.078156 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rxplv"
Mar 20 10:53:46 crc kubenswrapper[4748]: I0320 10:53:46.644182 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-x7sjb"
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.142692 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566734-jbtgv"]
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.144900 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566734-jbtgv"
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.149897 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.150364 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z"
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.150591 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.158466 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566734-jbtgv"]
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.293108 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf4rl\" (UniqueName: \"kubernetes.io/projected/a1f05295-f9d1-467f-9b86-99f226ca7765-kube-api-access-hf4rl\") pod \"auto-csr-approver-29566734-jbtgv\" (UID: \"a1f05295-f9d1-467f-9b86-99f226ca7765\") " pod="openshift-infra/auto-csr-approver-29566734-jbtgv"
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.395163 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf4rl\" (UniqueName: \"kubernetes.io/projected/a1f05295-f9d1-467f-9b86-99f226ca7765-kube-api-access-hf4rl\") pod \"auto-csr-approver-29566734-jbtgv\" (UID: \"a1f05295-f9d1-467f-9b86-99f226ca7765\") " pod="openshift-infra/auto-csr-approver-29566734-jbtgv"
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.428592 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf4rl\" (UniqueName: \"kubernetes.io/projected/a1f05295-f9d1-467f-9b86-99f226ca7765-kube-api-access-hf4rl\") pod \"auto-csr-approver-29566734-jbtgv\" (UID: \"a1f05295-f9d1-467f-9b86-99f226ca7765\") " pod="openshift-infra/auto-csr-approver-29566734-jbtgv"
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.484023 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566734-jbtgv"
Mar 20 10:54:00 crc kubenswrapper[4748]: I0320 10:54:00.955515 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566734-jbtgv"]
Mar 20 10:54:01 crc kubenswrapper[4748]: I0320 10:54:01.215435 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566734-jbtgv" event={"ID":"a1f05295-f9d1-467f-9b86-99f226ca7765","Type":"ContainerStarted","Data":"5becd6c60dca0ed1d08643188c273dbf3d1b9b8d12555dd939733435bca1065f"}
Mar 20 10:54:03 crc kubenswrapper[4748]: I0320 10:54:03.235866 4748 generic.go:334] "Generic (PLEG): container finished" podID="a1f05295-f9d1-467f-9b86-99f226ca7765" containerID="ff823410f00759d7564d35667a82e4412741fc7128102625a43ed2140f0ec63a" exitCode=0
Mar 20 10:54:03 crc kubenswrapper[4748]: I0320 10:54:03.235963 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566734-jbtgv" event={"ID":"a1f05295-f9d1-467f-9b86-99f226ca7765","Type":"ContainerDied","Data":"ff823410f00759d7564d35667a82e4412741fc7128102625a43ed2140f0ec63a"}
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.569693 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566734-jbtgv"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.668698 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5s5fj"]
Mar 20 10:54:04 crc kubenswrapper[4748]: E0320 10:54:04.669241 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f05295-f9d1-467f-9b86-99f226ca7765" containerName="oc"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.669268 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f05295-f9d1-467f-9b86-99f226ca7765" containerName="oc"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.669452 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1f05295-f9d1-467f-9b86-99f226ca7765" containerName="oc"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.670435 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.674100 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.674460 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.677372 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5dvfq"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.678139 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.685011 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf4rl\" (UniqueName: \"kubernetes.io/projected/a1f05295-f9d1-467f-9b86-99f226ca7765-kube-api-access-hf4rl\") pod \"a1f05295-f9d1-467f-9b86-99f226ca7765\" (UID: \"a1f05295-f9d1-467f-9b86-99f226ca7765\") "
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.701963 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f05295-f9d1-467f-9b86-99f226ca7765-kube-api-access-hf4rl" (OuterVolumeSpecName: "kube-api-access-hf4rl") pod "a1f05295-f9d1-467f-9b86-99f226ca7765" (UID: "a1f05295-f9d1-467f-9b86-99f226ca7765"). InnerVolumeSpecName "kube-api-access-hf4rl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.717526 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5s5fj"]
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.776658 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zfcwj"]
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.778844 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.781505 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.787614 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24928\" (UniqueName: \"kubernetes.io/projected/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-kube-api-access-24928\") pod \"dnsmasq-dns-675f4bcbfc-5s5fj\" (UID: \"7ed41b0a-dc41-453c-966b-c7dd5b490bfe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.787697 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-config\") pod \"dnsmasq-dns-675f4bcbfc-5s5fj\" (UID: \"7ed41b0a-dc41-453c-966b-c7dd5b490bfe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.787869 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf4rl\" (UniqueName: \"kubernetes.io/projected/a1f05295-f9d1-467f-9b86-99f226ca7765-kube-api-access-hf4rl\") on node \"crc\" DevicePath \"\""
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.811431 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zfcwj"]
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.889174 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zfcwj\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.889257 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj4s7\" (UniqueName: \"kubernetes.io/projected/a8087a23-47c8-476a-b1ed-f333c87a10c7-kube-api-access-jj4s7\") pod \"dnsmasq-dns-78dd6ddcc-zfcwj\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.889353 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24928\" (UniqueName: \"kubernetes.io/projected/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-kube-api-access-24928\") pod \"dnsmasq-dns-675f4bcbfc-5s5fj\" (UID: \"7ed41b0a-dc41-453c-966b-c7dd5b490bfe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.889666 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-config\") pod \"dnsmasq-dns-675f4bcbfc-5s5fj\" (UID: \"7ed41b0a-dc41-453c-966b-c7dd5b490bfe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.890145 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-config\") pod \"dnsmasq-dns-78dd6ddcc-zfcwj\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.890663 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-config\") pod \"dnsmasq-dns-675f4bcbfc-5s5fj\" (UID: \"7ed41b0a-dc41-453c-966b-c7dd5b490bfe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.912761 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24928\" (UniqueName: \"kubernetes.io/projected/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-kube-api-access-24928\") pod \"dnsmasq-dns-675f4bcbfc-5s5fj\" (UID: \"7ed41b0a-dc41-453c-966b-c7dd5b490bfe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.992547 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zfcwj\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.992690 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj4s7\" (UniqueName: \"kubernetes.io/projected/a8087a23-47c8-476a-b1ed-f333c87a10c7-kube-api-access-jj4s7\") pod \"dnsmasq-dns-78dd6ddcc-zfcwj\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.992762 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-config\") pod \"dnsmasq-dns-78dd6ddcc-zfcwj\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.993729 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zfcwj\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:04 crc kubenswrapper[4748]: I0320 10:54:04.993925 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-config\") pod \"dnsmasq-dns-78dd6ddcc-zfcwj\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:05 crc kubenswrapper[4748]: I0320 10:54:05.012919 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj4s7\" (UniqueName: \"kubernetes.io/projected/a8087a23-47c8-476a-b1ed-f333c87a10c7-kube-api-access-jj4s7\") pod \"dnsmasq-dns-78dd6ddcc-zfcwj\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:05 crc kubenswrapper[4748]: I0320 10:54:05.027153 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj"
Mar 20 10:54:05 crc kubenswrapper[4748]: I0320 10:54:05.094804 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj"
Mar 20 10:54:05 crc kubenswrapper[4748]: I0320 10:54:05.260287 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566734-jbtgv" event={"ID":"a1f05295-f9d1-467f-9b86-99f226ca7765","Type":"ContainerDied","Data":"5becd6c60dca0ed1d08643188c273dbf3d1b9b8d12555dd939733435bca1065f"}
Mar 20 10:54:05 crc kubenswrapper[4748]: I0320 10:54:05.260344 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5becd6c60dca0ed1d08643188c273dbf3d1b9b8d12555dd939733435bca1065f"
Mar 20 10:54:05 crc kubenswrapper[4748]: I0320 10:54:05.260452 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566734-jbtgv"
Mar 20 10:54:05 crc kubenswrapper[4748]: I0320 10:54:05.495093 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5s5fj"]
Mar 20 10:54:05 crc kubenswrapper[4748]: I0320 10:54:05.587981 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zfcwj"]
Mar 20 10:54:05 crc kubenswrapper[4748]: W0320 10:54:05.593165 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8087a23_47c8_476a_b1ed_f333c87a10c7.slice/crio-5578606eb524a959efa987c5e619beb83c8439c9284ed7356a08c517d4d2adfc WatchSource:0}: Error finding container 5578606eb524a959efa987c5e619beb83c8439c9284ed7356a08c517d4d2adfc: Status 404 returned error can't find the container with id 5578606eb524a959efa987c5e619beb83c8439c9284ed7356a08c517d4d2adfc
Mar 20 10:54:05 crc kubenswrapper[4748]: I0320 10:54:05.665829 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566728-z252s"]
Mar 20 10:54:05 crc kubenswrapper[4748]: I0320 10:54:05.672136 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566728-z252s"]
Mar 20 10:54:06 crc kubenswrapper[4748]: I0320 10:54:06.271209 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj" event={"ID":"7ed41b0a-dc41-453c-966b-c7dd5b490bfe","Type":"ContainerStarted","Data":"cd52a930108f45a5b4aca6b8e1e41d7325b987796f2d2470ff6fcb8b47a03f9b"}
Mar 20 10:54:06 crc kubenswrapper[4748]: I0320 10:54:06.274142 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj" event={"ID":"a8087a23-47c8-476a-b1ed-f333c87a10c7","Type":"ContainerStarted","Data":"5578606eb524a959efa987c5e619beb83c8439c9284ed7356a08c517d4d2adfc"}
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.536110 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04b7943-417c-40b2-bc8f-a8a7ed916a47" path="/var/lib/kubelet/pods/f04b7943-417c-40b2-bc8f-a8a7ed916a47/volumes"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.559190 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5s5fj"]
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.586108 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qq2xk"]
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.596956 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.601646 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qq2xk"]
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.752189 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7b2j\" (UniqueName: \"kubernetes.io/projected/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-kube-api-access-d7b2j\") pod \"dnsmasq-dns-5ccc8479f9-qq2xk\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.752306 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-config\") pod \"dnsmasq-dns-5ccc8479f9-qq2xk\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.752334 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-qq2xk\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.854268 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-config\") pod \"dnsmasq-dns-5ccc8479f9-qq2xk\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.854329 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-qq2xk\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.854404 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7b2j\" (UniqueName: \"kubernetes.io/projected/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-kube-api-access-d7b2j\") pod \"dnsmasq-dns-5ccc8479f9-qq2xk\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.855855 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-qq2xk\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.855947 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-config\") pod \"dnsmasq-dns-5ccc8479f9-qq2xk\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.897078 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7b2j\" (UniqueName: \"kubernetes.io/projected/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-kube-api-access-d7b2j\") pod \"dnsmasq-dns-5ccc8479f9-qq2xk\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:07 crc kubenswrapper[4748]: I0320 10:54:07.919411 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk"
Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.259956 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zfcwj"]
Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.313061 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dv6gr"]
Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.316000 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr"
Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.343346 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dv6gr"]
Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.468245 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgkv9\" (UniqueName: \"kubernetes.io/projected/c0179e70-7de5-4413-868b-8d442ec891e5-kube-api-access-sgkv9\") pod \"dnsmasq-dns-57d769cc4f-dv6gr\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr"
Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.468323 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-config\") pod \"dnsmasq-dns-57d769cc4f-dv6gr\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") "
pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.468352 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dv6gr\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.570346 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-config\") pod \"dnsmasq-dns-57d769cc4f-dv6gr\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.570412 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dv6gr\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.570507 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgkv9\" (UniqueName: \"kubernetes.io/projected/c0179e70-7de5-4413-868b-8d442ec891e5-kube-api-access-sgkv9\") pod \"dnsmasq-dns-57d769cc4f-dv6gr\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.579403 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-config\") pod \"dnsmasq-dns-57d769cc4f-dv6gr\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:54:08 crc kubenswrapper[4748]: 
I0320 10:54:08.589744 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dv6gr\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.603207 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgkv9\" (UniqueName: \"kubernetes.io/projected/c0179e70-7de5-4413-868b-8d442ec891e5-kube-api-access-sgkv9\") pod \"dnsmasq-dns-57d769cc4f-dv6gr\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.657553 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.891669 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qq2xk"] Mar 20 10:54:08 crc kubenswrapper[4748]: W0320 10:54:08.908185 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f69b3ab_d71e_44be_b0b0_fa830eb8756a.slice/crio-5793222b6b3cf464979eb7630466ac7e2dd9a5c51cf017ea06271d2b3a772977 WatchSource:0}: Error finding container 5793222b6b3cf464979eb7630466ac7e2dd9a5c51cf017ea06271d2b3a772977: Status 404 returned error can't find the container with id 5793222b6b3cf464979eb7630466ac7e2dd9a5c51cf017ea06271d2b3a772977 Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.921636 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.925010 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.928083 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.934544 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.934877 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.935052 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.935341 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.935518 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.935694 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-86jnz" Mar 20 10:54:08 crc kubenswrapper[4748]: I0320 10:54:08.962863 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.080999 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.081073 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-vmt7j\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-kube-api-access-vmt7j\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.081099 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.081134 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.081160 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.081198 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.081226 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.081258 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.081297 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.081336 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.081367 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.184850 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.184929 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.184967 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.185003 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.185040 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.185072 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.185100 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.185120 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmt7j\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-kube-api-access-vmt7j\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.185142 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.185167 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.185189 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.186098 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.186318 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.187340 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.191465 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.194749 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.196090 4748 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.199368 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.201908 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.202342 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.214881 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmt7j\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-kube-api-access-vmt7j\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.224925 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.234216 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.274123 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.369182 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk" event={"ID":"0f69b3ab-d71e-44be-b0b0-fa830eb8756a","Type":"ContainerStarted","Data":"5793222b6b3cf464979eb7630466ac7e2dd9a5c51cf017ea06271d2b3a772977"} Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.475269 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.479171 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.482634 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.483417 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.484513 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.485802 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.485952 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kzfkz" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.485986 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.486235 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.487268 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.547790 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dv6gr"] Mar 20 10:54:09 crc kubenswrapper[4748]: W0320 10:54:09.570972 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0179e70_7de5_4413_868b_8d442ec891e5.slice/crio-242bb20379b17471767cb9141d1623f2c76987d3ff4af3d38d6655d440a59abc WatchSource:0}: Error finding container 242bb20379b17471767cb9141d1623f2c76987d3ff4af3d38d6655d440a59abc: Status 404 returned error 
can't find the container with id 242bb20379b17471767cb9141d1623f2c76987d3ff4af3d38d6655d440a59abc Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.581097 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.627614 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.628017 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.628069 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.628096 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.628126 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.628152 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9362889-0195-4aad-96bd-ed63db88da83-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.628190 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.628223 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.628248 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4d4s\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-kube-api-access-w4d4s\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.628274 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c9362889-0195-4aad-96bd-ed63db88da83-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.628291 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732278 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9362889-0195-4aad-96bd-ed63db88da83-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732341 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732391 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732448 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " 
pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732496 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732548 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732582 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732621 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9362889-0195-4aad-96bd-ed63db88da83-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732724 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732783 4748 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.732827 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4d4s\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-kube-api-access-w4d4s\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.734927 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.738731 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9362889-0195-4aad-96bd-ed63db88da83-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.739073 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.741600 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.741965 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.742368 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9362889-0195-4aad-96bd-ed63db88da83-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.743002 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.743073 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.745391 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.763746 4748 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.769922 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4d4s\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-kube-api-access-w4d4s\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.774199 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.812802 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 10:54:09 crc kubenswrapper[4748]: I0320 10:54:09.979434 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 10:54:10 crc kubenswrapper[4748]: W0320 10:54:10.048036 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5a9b3e3_3a44_4765_ab5b_0e7955b524f7.slice/crio-33d77a2ba8801769f0bbcc59b4a07c93ff3300d311223f5055ef01fe17c63a03 WatchSource:0}: Error finding container 33d77a2ba8801769f0bbcc59b4a07c93ff3300d311223f5055ef01fe17c63a03: Status 404 returned error can't find the container with id 33d77a2ba8801769f0bbcc59b4a07c93ff3300d311223f5055ef01fe17c63a03 Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.140657 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.143611 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.147929 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.147953 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.148247 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-qvmlz" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.156542 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.162429 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.168855 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.262212 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.262317 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74743039-97e4-46cf-8fbf-183c8c11ca20-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.262355 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5hmvk\" (UniqueName: \"kubernetes.io/projected/74743039-97e4-46cf-8fbf-183c8c11ca20-kube-api-access-5hmvk\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.262408 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74743039-97e4-46cf-8fbf-183c8c11ca20-operator-scripts\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.262438 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74743039-97e4-46cf-8fbf-183c8c11ca20-config-data-default\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.262480 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74743039-97e4-46cf-8fbf-183c8c11ca20-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.262506 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74743039-97e4-46cf-8fbf-183c8c11ca20-kolla-config\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.262554 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/74743039-97e4-46cf-8fbf-183c8c11ca20-config-data-generated\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.365110 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74743039-97e4-46cf-8fbf-183c8c11ca20-operator-scripts\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.365195 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74743039-97e4-46cf-8fbf-183c8c11ca20-config-data-default\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.365233 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74743039-97e4-46cf-8fbf-183c8c11ca20-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.365258 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74743039-97e4-46cf-8fbf-183c8c11ca20-kolla-config\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.365302 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74743039-97e4-46cf-8fbf-183c8c11ca20-config-data-generated\") pod \"openstack-galera-0\" 
(UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.365347 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.365419 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74743039-97e4-46cf-8fbf-183c8c11ca20-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.365440 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hmvk\" (UniqueName: \"kubernetes.io/projected/74743039-97e4-46cf-8fbf-183c8c11ca20-kube-api-access-5hmvk\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.367602 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74743039-97e4-46cf-8fbf-183c8c11ca20-config-data-default\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.368126 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74743039-97e4-46cf-8fbf-183c8c11ca20-operator-scripts\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.368601 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74743039-97e4-46cf-8fbf-183c8c11ca20-config-data-generated\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.369748 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.374925 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74743039-97e4-46cf-8fbf-183c8c11ca20-kolla-config\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.439120 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74743039-97e4-46cf-8fbf-183c8c11ca20-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.439822 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74743039-97e4-46cf-8fbf-183c8c11ca20-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.487777 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hmvk\" (UniqueName: 
\"kubernetes.io/projected/74743039-97e4-46cf-8fbf-183c8c11ca20-kube-api-access-5hmvk\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.517921 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"74743039-97e4-46cf-8fbf-183c8c11ca20\") " pod="openstack/openstack-galera-0" Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.584229 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" event={"ID":"c0179e70-7de5-4413-868b-8d442ec891e5","Type":"ContainerStarted","Data":"242bb20379b17471767cb9141d1623f2c76987d3ff4af3d38d6655d440a59abc"} Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.613199 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7","Type":"ContainerStarted","Data":"33d77a2ba8801769f0bbcc59b4a07c93ff3300d311223f5055ef01fe17c63a03"} Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.707523 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 10:54:10 crc kubenswrapper[4748]: W0320 10:54:10.722749 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9362889_0195_4aad_96bd_ed63db88da83.slice/crio-6ddb29d78aaa1efa0d79ef7d996e0d06f7985af2a78b3e4a1947ce3b405d8608 WatchSource:0}: Error finding container 6ddb29d78aaa1efa0d79ef7d996e0d06f7985af2a78b3e4a1947ce3b405d8608: Status 404 returned error can't find the container with id 6ddb29d78aaa1efa0d79ef7d996e0d06f7985af2a78b3e4a1947ce3b405d8608 Mar 20 10:54:10 crc kubenswrapper[4748]: I0320 10:54:10.785738 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.555187 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 10:54:11 crc kubenswrapper[4748]: W0320 10:54:11.630281 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74743039_97e4_46cf_8fbf_183c8c11ca20.slice/crio-cb85f32c86982fde261f6db679b48b13037254aece360a2c9bed61593fc7d873 WatchSource:0}: Error finding container cb85f32c86982fde261f6db679b48b13037254aece360a2c9bed61593fc7d873: Status 404 returned error can't find the container with id cb85f32c86982fde261f6db679b48b13037254aece360a2c9bed61593fc7d873 Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.711714 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9362889-0195-4aad-96bd-ed63db88da83","Type":"ContainerStarted","Data":"6ddb29d78aaa1efa0d79ef7d996e0d06f7985af2a78b3e4a1947ce3b405d8608"} Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.720120 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.741665 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.746775 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.748038 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.748097 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.748210 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.748231 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-l8nps" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.749925 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.754529 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.754555 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dsdvc" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.754914 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.758864 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.830002 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838082 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ffd19d53-385f-45a9-a222-caa7fbf6545e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838161 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27339553-c013-4538-9a4d-5bbd249c197c-kolla-config\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838201 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbsfn\" (UniqueName: \"kubernetes.io/projected/27339553-c013-4538-9a4d-5bbd249c197c-kube-api-access-vbsfn\") pod \"memcached-0\" (UID: 
\"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838244 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ffd19d53-385f-45a9-a222-caa7fbf6545e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838281 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9xtv\" (UniqueName: \"kubernetes.io/projected/ffd19d53-385f-45a9-a222-caa7fbf6545e-kube-api-access-m9xtv\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838327 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27339553-c013-4538-9a4d-5bbd249c197c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838365 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffd19d53-385f-45a9-a222-caa7fbf6545e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838422 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd19d53-385f-45a9-a222-caa7fbf6545e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" 
(UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838450 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ffd19d53-385f-45a9-a222-caa7fbf6545e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838499 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27339553-c013-4538-9a4d-5bbd249c197c-config-data\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838542 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/27339553-c013-4538-9a4d-5bbd249c197c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838581 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd19d53-385f-45a9-a222-caa7fbf6545e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.838652 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " 
pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.941031 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27339553-c013-4538-9a4d-5bbd249c197c-config-data\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.941116 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/27339553-c013-4538-9a4d-5bbd249c197c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.941153 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd19d53-385f-45a9-a222-caa7fbf6545e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.941205 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.941299 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ffd19d53-385f-45a9-a222-caa7fbf6545e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.941336 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27339553-c013-4538-9a4d-5bbd249c197c-kolla-config\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.942783 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27339553-c013-4538-9a4d-5bbd249c197c-config-data\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.943321 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.944006 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbsfn\" (UniqueName: \"kubernetes.io/projected/27339553-c013-4538-9a4d-5bbd249c197c-kube-api-access-vbsfn\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.944124 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ffd19d53-385f-45a9-a222-caa7fbf6545e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.944171 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9xtv\" (UniqueName: 
\"kubernetes.io/projected/ffd19d53-385f-45a9-a222-caa7fbf6545e-kube-api-access-m9xtv\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.944282 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27339553-c013-4538-9a4d-5bbd249c197c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.944328 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffd19d53-385f-45a9-a222-caa7fbf6545e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.944440 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd19d53-385f-45a9-a222-caa7fbf6545e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.944480 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ffd19d53-385f-45a9-a222-caa7fbf6545e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.945559 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ffd19d53-385f-45a9-a222-caa7fbf6545e-kolla-config\") pod \"openstack-cell1-galera-0\" 
(UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.946349 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ffd19d53-385f-45a9-a222-caa7fbf6545e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.946952 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27339553-c013-4538-9a4d-5bbd249c197c-kolla-config\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.947616 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ffd19d53-385f-45a9-a222-caa7fbf6545e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.959769 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/27339553-c013-4538-9a4d-5bbd249c197c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.961967 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffd19d53-385f-45a9-a222-caa7fbf6545e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.967009 
4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd19d53-385f-45a9-a222-caa7fbf6545e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.969501 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd19d53-385f-45a9-a222-caa7fbf6545e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.974342 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbsfn\" (UniqueName: \"kubernetes.io/projected/27339553-c013-4538-9a4d-5bbd249c197c-kube-api-access-vbsfn\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.974493 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27339553-c013-4538-9a4d-5bbd249c197c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"27339553-c013-4538-9a4d-5bbd249c197c\") " pod="openstack/memcached-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.974685 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9xtv\" (UniqueName: \"kubernetes.io/projected/ffd19d53-385f-45a9-a222-caa7fbf6545e-kube-api-access-m9xtv\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:11 crc kubenswrapper[4748]: I0320 10:54:11.986073 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ffd19d53-385f-45a9-a222-caa7fbf6545e\") " pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:12 crc kubenswrapper[4748]: I0320 10:54:12.099749 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 10:54:12 crc kubenswrapper[4748]: I0320 10:54:12.111360 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 10:54:12 crc kubenswrapper[4748]: I0320 10:54:12.774000 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"74743039-97e4-46cf-8fbf-183c8c11ca20","Type":"ContainerStarted","Data":"cb85f32c86982fde261f6db679b48b13037254aece360a2c9bed61593fc7d873"} Mar 20 10:54:12 crc kubenswrapper[4748]: I0320 10:54:12.928243 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:54:12 crc kubenswrapper[4748]: I0320 10:54:12.929104 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:54:13 crc kubenswrapper[4748]: I0320 10:54:13.196477 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 10:54:13 crc kubenswrapper[4748]: W0320 10:54:13.276305 4748 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27339553_c013_4538_9a4d_5bbd249c197c.slice/crio-58c899938ec42745b9e06f9c6620babe7bbb48549d73cb1b1fe7a756107bc4c9 WatchSource:0}: Error finding container 58c899938ec42745b9e06f9c6620babe7bbb48549d73cb1b1fe7a756107bc4c9: Status 404 returned error can't find the container with id 58c899938ec42745b9e06f9c6620babe7bbb48549d73cb1b1fe7a756107bc4c9 Mar 20 10:54:13 crc kubenswrapper[4748]: I0320 10:54:13.303161 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 10:54:13 crc kubenswrapper[4748]: W0320 10:54:13.321409 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffd19d53_385f_45a9_a222_caa7fbf6545e.slice/crio-b19961f121df2f1b6320dbf3898580990456a470f5ddd04a2d315435471721d7 WatchSource:0}: Error finding container b19961f121df2f1b6320dbf3898580990456a470f5ddd04a2d315435471721d7: Status 404 returned error can't find the container with id b19961f121df2f1b6320dbf3898580990456a470f5ddd04a2d315435471721d7 Mar 20 10:54:13 crc kubenswrapper[4748]: I0320 10:54:13.807512 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ffd19d53-385f-45a9-a222-caa7fbf6545e","Type":"ContainerStarted","Data":"b19961f121df2f1b6320dbf3898580990456a470f5ddd04a2d315435471721d7"} Mar 20 10:54:13 crc kubenswrapper[4748]: I0320 10:54:13.828313 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"27339553-c013-4538-9a4d-5bbd249c197c","Type":"ContainerStarted","Data":"58c899938ec42745b9e06f9c6620babe7bbb48549d73cb1b1fe7a756107bc4c9"} Mar 20 10:54:14 crc kubenswrapper[4748]: I0320 10:54:14.539564 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 10:54:14 crc kubenswrapper[4748]: I0320 10:54:14.541194 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 10:54:14 crc kubenswrapper[4748]: I0320 10:54:14.549105 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-9gpg2" Mar 20 10:54:14 crc kubenswrapper[4748]: I0320 10:54:14.565384 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 10:54:14 crc kubenswrapper[4748]: I0320 10:54:14.624757 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsvcl\" (UniqueName: \"kubernetes.io/projected/cae37e8d-39b5-4045-aa76-b36630130555-kube-api-access-fsvcl\") pod \"kube-state-metrics-0\" (UID: \"cae37e8d-39b5-4045-aa76-b36630130555\") " pod="openstack/kube-state-metrics-0" Mar 20 10:54:14 crc kubenswrapper[4748]: I0320 10:54:14.739222 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsvcl\" (UniqueName: \"kubernetes.io/projected/cae37e8d-39b5-4045-aa76-b36630130555-kube-api-access-fsvcl\") pod \"kube-state-metrics-0\" (UID: \"cae37e8d-39b5-4045-aa76-b36630130555\") " pod="openstack/kube-state-metrics-0" Mar 20 10:54:14 crc kubenswrapper[4748]: I0320 10:54:14.766571 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsvcl\" (UniqueName: \"kubernetes.io/projected/cae37e8d-39b5-4045-aa76-b36630130555-kube-api-access-fsvcl\") pod \"kube-state-metrics-0\" (UID: \"cae37e8d-39b5-4045-aa76-b36630130555\") " pod="openstack/kube-state-metrics-0" Mar 20 10:54:14 crc kubenswrapper[4748]: I0320 10:54:14.886089 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 10:54:14 crc kubenswrapper[4748]: I0320 10:54:14.893635 4748 scope.go:117] "RemoveContainer" containerID="b089bfcde2b05951629c210ad4c13c707d17e323e4706365a2d8ab37cb31bfb4" Mar 20 10:54:15 crc kubenswrapper[4748]: I0320 10:54:15.952620 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.210044 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.212676 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.217251 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.217533 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-tpswc" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.217738 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.217901 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.220755 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.240581 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.362414 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.362496 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.362534 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.362562 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.362599 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clxfc\" (UniqueName: \"kubernetes.io/projected/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-kube-api-access-clxfc\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.362624 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.362647 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.362869 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.465244 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.465316 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.465341 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.465365 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.465398 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clxfc\" (UniqueName: \"kubernetes.io/projected/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-kube-api-access-clxfc\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.465418 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.465437 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.465490 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.467134 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" 
(UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.467604 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.468119 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.466821 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.480297 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.481722 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.485926 4748 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.488080 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clxfc\" (UniqueName: \"kubernetes.io/projected/9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2-kube-api-access-clxfc\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.502185 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:17 crc kubenswrapper[4748]: I0320 10:54:17.574183 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.366418 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bldp9"] Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.368380 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.372357 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-fvkvc" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.372631 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.372866 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.441251 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bldp9"] Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.495119 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-756zb"] Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.497372 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.518151 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-756zb"] Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.520333 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2482f122-92d5-410c-b4c0-41834cea1711-var-log-ovn\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.526959 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2482f122-92d5-410c-b4c0-41834cea1711-scripts\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.527376 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77c7h\" (UniqueName: \"kubernetes.io/projected/2482f122-92d5-410c-b4c0-41834cea1711-kube-api-access-77c7h\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.527560 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2482f122-92d5-410c-b4c0-41834cea1711-var-run-ovn\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.527807 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2482f122-92d5-410c-b4c0-41834cea1711-combined-ca-bundle\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.528033 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2482f122-92d5-410c-b4c0-41834cea1711-var-run\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.528222 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2482f122-92d5-410c-b4c0-41834cea1711-ovn-controller-tls-certs\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.629780 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2482f122-92d5-410c-b4c0-41834cea1711-ovn-controller-tls-certs\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.629862 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxmzh\" (UniqueName: \"kubernetes.io/projected/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-kube-api-access-fxmzh\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.629955 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2482f122-92d5-410c-b4c0-41834cea1711-var-log-ovn\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.629991 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2482f122-92d5-410c-b4c0-41834cea1711-scripts\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.630060 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-var-lib\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.630090 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-var-log\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.630143 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77c7h\" (UniqueName: \"kubernetes.io/projected/2482f122-92d5-410c-b4c0-41834cea1711-kube-api-access-77c7h\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.630168 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-var-run\") pod 
\"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.630191 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2482f122-92d5-410c-b4c0-41834cea1711-var-run-ovn\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.630219 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-scripts\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.630293 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2482f122-92d5-410c-b4c0-41834cea1711-combined-ca-bundle\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.630319 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-etc-ovs\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.630361 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2482f122-92d5-410c-b4c0-41834cea1711-var-run\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 
20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.630968 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2482f122-92d5-410c-b4c0-41834cea1711-var-run\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.633479 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2482f122-92d5-410c-b4c0-41834cea1711-var-log-ovn\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.634130 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2482f122-92d5-410c-b4c0-41834cea1711-var-run-ovn\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.636100 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2482f122-92d5-410c-b4c0-41834cea1711-scripts\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.652878 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2482f122-92d5-410c-b4c0-41834cea1711-ovn-controller-tls-certs\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.660299 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77c7h\" (UniqueName: 
\"kubernetes.io/projected/2482f122-92d5-410c-b4c0-41834cea1711-kube-api-access-77c7h\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.684167 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2482f122-92d5-410c-b4c0-41834cea1711-combined-ca-bundle\") pod \"ovn-controller-bldp9\" (UID: \"2482f122-92d5-410c-b4c0-41834cea1711\") " pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.731811 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bldp9" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.732656 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxmzh\" (UniqueName: \"kubernetes.io/projected/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-kube-api-access-fxmzh\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.732775 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-var-lib\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.732813 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-var-log\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.732892 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-var-run\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.732933 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-scripts\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.733003 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-etc-ovs\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.733200 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-var-run\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.733222 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-var-log\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.733514 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-var-lib\") pod \"ovn-controller-ovs-756zb\" (UID: 
\"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.733546 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-etc-ovs\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.735283 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-scripts\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.791353 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxmzh\" (UniqueName: \"kubernetes.io/projected/334c1861-88b1-44e2-a02e-ad1dcecf2fc0-kube-api-access-fxmzh\") pod \"ovn-controller-ovs-756zb\" (UID: \"334c1861-88b1-44e2-a02e-ad1dcecf2fc0\") " pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:18 crc kubenswrapper[4748]: I0320 10:54:18.838440 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.679699 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.684286 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.700004 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-cjr2q" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.700048 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.700347 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.700568 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.704398 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.821918 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.821991 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-config\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.822085 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.822758 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxlbd\" (UniqueName: \"kubernetes.io/projected/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-kube-api-access-mxlbd\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.822929 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.823030 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.823095 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.823215 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " 
pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.925497 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.925654 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.926327 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.926549 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.926647 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-config\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.926725 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.926996 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxlbd\" (UniqueName: \"kubernetes.io/projected/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-kube-api-access-mxlbd\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.927200 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.927248 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.928388 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-config\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.930754 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " 
pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.931264 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.939418 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.944501 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.946820 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.949438 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxlbd\" (UniqueName: \"kubernetes.io/projected/6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78-kube-api-access-mxlbd\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:20 crc kubenswrapper[4748]: I0320 10:54:20.967663 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78\") " pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:21 crc kubenswrapper[4748]: I0320 10:54:21.035318 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:31 crc kubenswrapper[4748]: I0320 10:54:31.051913 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cae37e8d-39b5-4045-aa76-b36630130555","Type":"ContainerStarted","Data":"1f9ceb1d83b0268a51fcbb041538e072d0a06d480ae737ff6447b08905aafe3f"} Mar 20 10:54:40 crc kubenswrapper[4748]: E0320 10:54:40.844593 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 20 10:54:40 crc kubenswrapper[4748]: E0320 10:54:40.845378 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4d4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(c9362889-0195-4aad-96bd-ed63db88da83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:54:40 crc 
kubenswrapper[4748]: E0320 10:54:40.847895 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="c9362889-0195-4aad-96bd-ed63db88da83" Mar 20 10:54:40 crc kubenswrapper[4748]: E0320 10:54:40.869703 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 20 10:54:40 crc kubenswrapper[4748]: E0320 10:54:40.869933 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmt7j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(a5a9b3e3-3a44-4765-ab5b-0e7955b524f7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:54:40 crc 
kubenswrapper[4748]: E0320 10:54:40.871351 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" Mar 20 10:54:41 crc kubenswrapper[4748]: E0320 10:54:41.173759 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="c9362889-0195-4aad-96bd-ed63db88da83" Mar 20 10:54:41 crc kubenswrapper[4748]: E0320 10:54:41.175154 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" Mar 20 10:54:42 crc kubenswrapper[4748]: E0320 10:54:42.810322 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 20 10:54:42 crc kubenswrapper[4748]: E0320 10:54:42.810607 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5hmvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(74743039-97e4-46cf-8fbf-183c8c11ca20): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:54:42 crc kubenswrapper[4748]: E0320 10:54:42.812102 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="74743039-97e4-46cf-8fbf-183c8c11ca20" Mar 20 10:54:42 crc kubenswrapper[4748]: I0320 10:54:42.928426 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:54:42 crc kubenswrapper[4748]: I0320 10:54:42.928826 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:54:43 crc kubenswrapper[4748]: E0320 10:54:43.192102 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="74743039-97e4-46cf-8fbf-183c8c11ca20" Mar 20 10:54:43 crc kubenswrapper[4748]: E0320 10:54:43.655953 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 10:54:43 crc kubenswrapper[4748]: E0320 10:54:43.656209 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d7b2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},
StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-qq2xk_openstack(0f69b3ab-d71e-44be-b0b0-fa830eb8756a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:54:43 crc kubenswrapper[4748]: E0320 10:54:43.658140 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk" podUID="0f69b3ab-d71e-44be-b0b0-fa830eb8756a" Mar 20 10:54:43 crc kubenswrapper[4748]: E0320 10:54:43.675779 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 10:54:43 crc kubenswrapper[4748]: E0320 10:54:43.676061 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jj4s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-zfcwj_openstack(a8087a23-47c8-476a-b1ed-f333c87a10c7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:54:43 crc kubenswrapper[4748]: E0320 10:54:43.677521 4748 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj" podUID="a8087a23-47c8-476a-b1ed-f333c87a10c7" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.199063 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk" podUID="0f69b3ab-d71e-44be-b0b0-fa830eb8756a" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.357291 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.357540 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n9fh5c9h69hf9h658h675h556h58fhbh68ch5fdh77h6bh7dh5fh85h6bhbfh68h57ch9fh65bh5f5h689h5b4h676h675h55fh686hbh545h587q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vbsfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(27339553-c013-4538-9a4d-5bbd249c197c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.359100 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="27339553-c013-4538-9a4d-5bbd249c197c" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.405176 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.405462 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgkv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},S
tartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-dv6gr_openstack(c0179e70-7de5-4413-868b-8d442ec891e5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.408538 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" podUID="c0179e70-7de5-4413-868b-8d442ec891e5" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.441216 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.441515 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-24928,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-5s5fj_openstack(7ed41b0a-dc41-453c-966b-c7dd5b490bfe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.443062 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj" podUID="7ed41b0a-dc41-453c-966b-c7dd5b490bfe" Mar 20 10:54:44 crc kubenswrapper[4748]: I0320 10:54:44.671638 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj" Mar 20 10:54:44 crc kubenswrapper[4748]: I0320 10:54:44.764821 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-dns-svc\") pod \"a8087a23-47c8-476a-b1ed-f333c87a10c7\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " Mar 20 10:54:44 crc kubenswrapper[4748]: I0320 10:54:44.765079 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-config\") pod \"a8087a23-47c8-476a-b1ed-f333c87a10c7\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " Mar 20 10:54:44 crc kubenswrapper[4748]: I0320 10:54:44.765135 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj4s7\" (UniqueName: \"kubernetes.io/projected/a8087a23-47c8-476a-b1ed-f333c87a10c7-kube-api-access-jj4s7\") pod \"a8087a23-47c8-476a-b1ed-f333c87a10c7\" (UID: \"a8087a23-47c8-476a-b1ed-f333c87a10c7\") " Mar 20 10:54:44 crc kubenswrapper[4748]: I0320 10:54:44.767042 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-config" (OuterVolumeSpecName: "config") pod "a8087a23-47c8-476a-b1ed-f333c87a10c7" (UID: "a8087a23-47c8-476a-b1ed-f333c87a10c7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:44 crc kubenswrapper[4748]: I0320 10:54:44.767389 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8087a23-47c8-476a-b1ed-f333c87a10c7" (UID: "a8087a23-47c8-476a-b1ed-f333c87a10c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:44 crc kubenswrapper[4748]: I0320 10:54:44.783502 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8087a23-47c8-476a-b1ed-f333c87a10c7-kube-api-access-jj4s7" (OuterVolumeSpecName: "kube-api-access-jj4s7") pod "a8087a23-47c8-476a-b1ed-f333c87a10c7" (UID: "a8087a23-47c8-476a-b1ed-f333c87a10c7"). InnerVolumeSpecName "kube-api-access-jj4s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:44 crc kubenswrapper[4748]: I0320 10:54:44.869414 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:44 crc kubenswrapper[4748]: I0320 10:54:44.869450 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8087a23-47c8-476a-b1ed-f333c87a10c7-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:44 crc kubenswrapper[4748]: I0320 10:54:44.869460 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj4s7\" (UniqueName: \"kubernetes.io/projected/a8087a23-47c8-476a-b1ed-f333c87a10c7-kube-api-access-jj4s7\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.926780 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 
20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.927412 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m9xtv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,
EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(ffd19d53-385f-45a9-a222-caa7fbf6545e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:54:44 crc kubenswrapper[4748]: E0320 10:54:44.928640 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="ffd19d53-385f-45a9-a222-caa7fbf6545e" Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.002342 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bldp9"] Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.037998 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.129218 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.213205 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj" Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.213247 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zfcwj" event={"ID":"a8087a23-47c8-476a-b1ed-f333c87a10c7","Type":"ContainerDied","Data":"5578606eb524a959efa987c5e619beb83c8439c9284ed7356a08c517d4d2adfc"} Mar 20 10:54:45 crc kubenswrapper[4748]: E0320 10:54:45.215502 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" podUID="c0179e70-7de5-4413-868b-8d442ec891e5" Mar 20 10:54:45 crc kubenswrapper[4748]: E0320 10:54:45.215694 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="ffd19d53-385f-45a9-a222-caa7fbf6545e" Mar 20 10:54:45 crc kubenswrapper[4748]: E0320 10:54:45.216368 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="27339553-c013-4538-9a4d-5bbd249c197c" Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.400771 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zfcwj"] Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.405764 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zfcwj"] Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.534740 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a8087a23-47c8-476a-b1ed-f333c87a10c7" path="/var/lib/kubelet/pods/a8087a23-47c8-476a-b1ed-f333c87a10c7/volumes" Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.784934 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj" Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.906642 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-756zb"] Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.910514 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-config\") pod \"7ed41b0a-dc41-453c-966b-c7dd5b490bfe\" (UID: \"7ed41b0a-dc41-453c-966b-c7dd5b490bfe\") " Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.910987 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24928\" (UniqueName: \"kubernetes.io/projected/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-kube-api-access-24928\") pod \"7ed41b0a-dc41-453c-966b-c7dd5b490bfe\" (UID: \"7ed41b0a-dc41-453c-966b-c7dd5b490bfe\") " Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.913858 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-config" (OuterVolumeSpecName: "config") pod "7ed41b0a-dc41-453c-966b-c7dd5b490bfe" (UID: "7ed41b0a-dc41-453c-966b-c7dd5b490bfe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:45 crc kubenswrapper[4748]: I0320 10:54:45.922434 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-kube-api-access-24928" (OuterVolumeSpecName: "kube-api-access-24928") pod "7ed41b0a-dc41-453c-966b-c7dd5b490bfe" (UID: "7ed41b0a-dc41-453c-966b-c7dd5b490bfe"). 
InnerVolumeSpecName "kube-api-access-24928". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:46 crc kubenswrapper[4748]: I0320 10:54:46.012797 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:46 crc kubenswrapper[4748]: I0320 10:54:46.012947 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24928\" (UniqueName: \"kubernetes.io/projected/7ed41b0a-dc41-453c-966b-c7dd5b490bfe-kube-api-access-24928\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:46 crc kubenswrapper[4748]: I0320 10:54:46.223512 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2","Type":"ContainerStarted","Data":"96f23076ef3e6407cd30a275ea9db841871632a97d36127da370364684caa4b5"} Mar 20 10:54:46 crc kubenswrapper[4748]: I0320 10:54:46.225811 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj" Mar 20 10:54:46 crc kubenswrapper[4748]: I0320 10:54:46.225802 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-5s5fj" event={"ID":"7ed41b0a-dc41-453c-966b-c7dd5b490bfe","Type":"ContainerDied","Data":"cd52a930108f45a5b4aca6b8e1e41d7325b987796f2d2470ff6fcb8b47a03f9b"} Mar 20 10:54:46 crc kubenswrapper[4748]: I0320 10:54:46.227221 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-756zb" event={"ID":"334c1861-88b1-44e2-a02e-ad1dcecf2fc0","Type":"ContainerStarted","Data":"b1ae400db342c64101cb09e43efcdb7f906c10c5afebd1551cce7187e1824935"} Mar 20 10:54:46 crc kubenswrapper[4748]: I0320 10:54:46.229528 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bldp9" event={"ID":"2482f122-92d5-410c-b4c0-41834cea1711","Type":"ContainerStarted","Data":"3ca5f5df1887f6cc3f3b7c0d2b140a8b16dac62ee7fae23782c7122116618457"} Mar 20 10:54:46 crc kubenswrapper[4748]: I0320 10:54:46.231525 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78","Type":"ContainerStarted","Data":"71a1604a1f22b51bd539bdf0c9f07e4b75f132fc864d4130c17fed521baa5ca6"} Mar 20 10:54:46 crc kubenswrapper[4748]: I0320 10:54:46.355387 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5s5fj"] Mar 20 10:54:46 crc kubenswrapper[4748]: I0320 10:54:46.411861 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5s5fj"] Mar 20 10:54:47 crc kubenswrapper[4748]: I0320 10:54:47.527098 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed41b0a-dc41-453c-966b-c7dd5b490bfe" path="/var/lib/kubelet/pods/7ed41b0a-dc41-453c-966b-c7dd5b490bfe/volumes" Mar 20 10:54:48 crc kubenswrapper[4748]: I0320 10:54:48.253271 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"cae37e8d-39b5-4045-aa76-b36630130555","Type":"ContainerStarted","Data":"ffaf001983b0e0e79ddb3a63e0ef90c30af0324d2384906b1e696e1b7cf09210"} Mar 20 10:54:48 crc kubenswrapper[4748]: I0320 10:54:48.253952 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 10:54:48 crc kubenswrapper[4748]: I0320 10:54:48.272863 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.628201454 podStartE2EDuration="34.272826928s" podCreationTimestamp="2026-03-20 10:54:14 +0000 UTC" firstStartedPulling="2026-03-20 10:54:30.519161961 +0000 UTC m=+1105.660707785" lastFinishedPulling="2026-03-20 10:54:47.163787445 +0000 UTC m=+1122.305333259" observedRunningTime="2026-03-20 10:54:48.271193917 +0000 UTC m=+1123.412739731" watchObservedRunningTime="2026-03-20 10:54:48.272826928 +0000 UTC m=+1123.414372742" Mar 20 10:54:50 crc kubenswrapper[4748]: I0320 10:54:50.277254 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78","Type":"ContainerStarted","Data":"d1f2d16a0c0ce74bdf6d2e2b0588e37e364b07c13ef01239addc944a240c2691"} Mar 20 10:54:50 crc kubenswrapper[4748]: I0320 10:54:50.280454 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2","Type":"ContainerStarted","Data":"0db1cb4ff98a10ba507baeb3c7998fbb255307a0fc8c22dd83868a9661571197"} Mar 20 10:54:50 crc kubenswrapper[4748]: I0320 10:54:50.281891 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bldp9" event={"ID":"2482f122-92d5-410c-b4c0-41834cea1711","Type":"ContainerStarted","Data":"4ae278b6e44fe5cb14d1a29669c74dd4da6250dd026cafe92bf218371012a014"} Mar 20 10:54:50 crc kubenswrapper[4748]: I0320 10:54:50.282336 4748 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-controller-bldp9" Mar 20 10:54:50 crc kubenswrapper[4748]: I0320 10:54:50.284867 4748 generic.go:334] "Generic (PLEG): container finished" podID="334c1861-88b1-44e2-a02e-ad1dcecf2fc0" containerID="1cbdc9a564a63c37b984a1e0b67643e24023dc23737b3772621220b2e12abc5d" exitCode=0 Mar 20 10:54:50 crc kubenswrapper[4748]: I0320 10:54:50.284945 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-756zb" event={"ID":"334c1861-88b1-44e2-a02e-ad1dcecf2fc0","Type":"ContainerDied","Data":"1cbdc9a564a63c37b984a1e0b67643e24023dc23737b3772621220b2e12abc5d"} Mar 20 10:54:50 crc kubenswrapper[4748]: I0320 10:54:50.313113 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-bldp9" podStartSLOduration=28.184563176 podStartE2EDuration="32.313085072s" podCreationTimestamp="2026-03-20 10:54:18 +0000 UTC" firstStartedPulling="2026-03-20 10:54:45.418585568 +0000 UTC m=+1120.560131382" lastFinishedPulling="2026-03-20 10:54:49.547107464 +0000 UTC m=+1124.688653278" observedRunningTime="2026-03-20 10:54:50.305211854 +0000 UTC m=+1125.446757678" watchObservedRunningTime="2026-03-20 10:54:50.313085072 +0000 UTC m=+1125.454630886" Mar 20 10:54:51 crc kubenswrapper[4748]: I0320 10:54:51.298073 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-756zb" event={"ID":"334c1861-88b1-44e2-a02e-ad1dcecf2fc0","Type":"ContainerStarted","Data":"7e43ed6c94fe75018de4d33f7c7488014003441807b5f97fab2d5298684af5a7"} Mar 20 10:54:51 crc kubenswrapper[4748]: I0320 10:54:51.298964 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:51 crc kubenswrapper[4748]: I0320 10:54:51.298984 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-756zb" 
event={"ID":"334c1861-88b1-44e2-a02e-ad1dcecf2fc0","Type":"ContainerStarted","Data":"f7fef9bfb746b51d050c17d7aff4f6eb905f508600232b34475ec2146602d3d3"} Mar 20 10:54:51 crc kubenswrapper[4748]: I0320 10:54:51.299003 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-756zb" Mar 20 10:54:53 crc kubenswrapper[4748]: I0320 10:54:53.546935 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-756zb" podStartSLOduration=31.946651239 podStartE2EDuration="35.546911994s" podCreationTimestamp="2026-03-20 10:54:18 +0000 UTC" firstStartedPulling="2026-03-20 10:54:45.946189083 +0000 UTC m=+1121.087734897" lastFinishedPulling="2026-03-20 10:54:49.546449838 +0000 UTC m=+1124.687995652" observedRunningTime="2026-03-20 10:54:51.330824909 +0000 UTC m=+1126.472370733" watchObservedRunningTime="2026-03-20 10:54:53.546911994 +0000 UTC m=+1128.688457818" Mar 20 10:54:54 crc kubenswrapper[4748]: I0320 10:54:54.325132 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78","Type":"ContainerStarted","Data":"7e5e845c61889e15a123dba5a1e4d2681c5f7fbb9adf710cfc2908452f80077b"} Mar 20 10:54:54 crc kubenswrapper[4748]: I0320 10:54:54.328054 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2","Type":"ContainerStarted","Data":"61a75f2c29b3a97a0e65b19843d722fde4ca68196a1f926e2c53fbc2ef9c5216"} Mar 20 10:54:54 crc kubenswrapper[4748]: I0320 10:54:54.331186 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"74743039-97e4-46cf-8fbf-183c8c11ca20","Type":"ContainerStarted","Data":"d7cf477475f12edbe557b6ccba8fb795c3fc1e78e3ac026f21911552125d4fb4"} Mar 20 10:54:54 crc kubenswrapper[4748]: I0320 10:54:54.347589 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-sb-0" podStartSLOduration=27.102821792 podStartE2EDuration="35.347562974s" podCreationTimestamp="2026-03-20 10:54:19 +0000 UTC" firstStartedPulling="2026-03-20 10:54:45.423176604 +0000 UTC m=+1120.564722418" lastFinishedPulling="2026-03-20 10:54:53.667917786 +0000 UTC m=+1128.809463600" observedRunningTime="2026-03-20 10:54:54.345400869 +0000 UTC m=+1129.486946693" watchObservedRunningTime="2026-03-20 10:54:54.347562974 +0000 UTC m=+1129.489108788" Mar 20 10:54:54 crc kubenswrapper[4748]: I0320 10:54:54.892348 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 10:54:54 crc kubenswrapper[4748]: I0320 10:54:54.916717 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=30.636469327 podStartE2EDuration="38.916690662s" podCreationTimestamp="2026-03-20 10:54:16 +0000 UTC" firstStartedPulling="2026-03-20 10:54:45.40474653 +0000 UTC m=+1120.546292344" lastFinishedPulling="2026-03-20 10:54:53.684967865 +0000 UTC m=+1128.826513679" observedRunningTime="2026-03-20 10:54:54.407901501 +0000 UTC m=+1129.549447335" watchObservedRunningTime="2026-03-20 10:54:54.916690662 +0000 UTC m=+1130.058236476" Mar 20 10:54:56 crc kubenswrapper[4748]: I0320 10:54:56.035756 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:56 crc kubenswrapper[4748]: I0320 10:54:56.576205 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:56 crc kubenswrapper[4748]: I0320 10:54:56.615337 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.035545 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 
10:54:57.074361 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.360429 4748 generic.go:334] "Generic (PLEG): container finished" podID="c0179e70-7de5-4413-868b-8d442ec891e5" containerID="db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c" exitCode=0 Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.360554 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" event={"ID":"c0179e70-7de5-4413-868b-8d442ec891e5","Type":"ContainerDied","Data":"db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c"} Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.361278 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.416060 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.432107 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.657439 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qq2xk"] Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.728666 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdcvl"] Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.730415 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.739676 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-zhsd6"] Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.742564 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.755927 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdcvl"] Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.764504 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.765233 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.772238 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.772345 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-config\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.772432 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-827hq\" (UniqueName: \"kubernetes.io/projected/0c60c0d4-dc51-4249-a35d-3a9601052fcd-kube-api-access-827hq\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.772529 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.774880 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-zhsd6"] Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.882149 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.882327 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqpdk\" (UniqueName: \"kubernetes.io/projected/6b172e4c-c5b0-4573-b80c-9bc074489627-kube-api-access-dqpdk\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.883015 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6b172e4c-c5b0-4573-b80c-9bc074489627-ovs-rundir\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.883503 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b172e4c-c5b0-4573-b80c-9bc074489627-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 
10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.886442 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6b172e4c-c5b0-4573-b80c-9bc074489627-ovn-rundir\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.886594 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.886727 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b172e4c-c5b0-4573-b80c-9bc074489627-config\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.887161 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-config\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.887327 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b172e4c-c5b0-4573-b80c-9bc074489627-combined-ca-bundle\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc 
kubenswrapper[4748]: I0320 10:54:57.890586 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.893070 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-config\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.893348 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-827hq\" (UniqueName: \"kubernetes.io/projected/0c60c0d4-dc51-4249-a35d-3a9601052fcd-kube-api-access-827hq\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.899225 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.913602 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dv6gr"] Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.953170 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8hm6n"] Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.969098 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.973735 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.977003 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.991311 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8hm6n"] Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.991658 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.997034 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6b172e4c-c5b0-4573-b80c-9bc074489627-ovn-rundir\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.997099 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b172e4c-c5b0-4573-b80c-9bc074489627-config\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.997166 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b172e4c-c5b0-4573-b80c-9bc074489627-combined-ca-bundle\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.997292 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-dqpdk\" (UniqueName: \"kubernetes.io/projected/6b172e4c-c5b0-4573-b80c-9bc074489627-kube-api-access-dqpdk\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.997318 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6b172e4c-c5b0-4573-b80c-9bc074489627-ovs-rundir\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:57 crc kubenswrapper[4748]: I0320 10:54:57.997380 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b172e4c-c5b0-4573-b80c-9bc074489627-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.009039 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6b172e4c-c5b0-4573-b80c-9bc074489627-ovs-rundir\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.009770 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6b172e4c-c5b0-4573-b80c-9bc074489627-ovn-rundir\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.015054 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 10:54:58 crc 
kubenswrapper[4748]: I0320 10:54:58.015328 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.015399 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b172e4c-c5b0-4573-b80c-9bc074489627-combined-ca-bundle\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.015504 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.015702 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-nmbsn" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.018708 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.021890 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-827hq\" (UniqueName: \"kubernetes.io/projected/0c60c0d4-dc51-4249-a35d-3a9601052fcd-kube-api-access-827hq\") pod \"dnsmasq-dns-7fd796d7df-kdcvl\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.025843 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b172e4c-c5b0-4573-b80c-9bc074489627-config\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.037127 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqpdk\" (UniqueName: 
\"kubernetes.io/projected/6b172e4c-c5b0-4573-b80c-9bc074489627-kube-api-access-dqpdk\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.044762 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b172e4c-c5b0-4573-b80c-9bc074489627-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zhsd6\" (UID: \"6b172e4c-c5b0-4573-b80c-9bc074489627\") " pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.099709 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b80cd60-a2f6-4638-a600-4d866573bbc3-config\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.099784 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.099820 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b80cd60-a2f6-4638-a600-4d866573bbc3-scripts\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.099858 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9b80cd60-a2f6-4638-a600-4d866573bbc3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.099896 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b80cd60-a2f6-4638-a600-4d866573bbc3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.099931 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b80cd60-a2f6-4638-a600-4d866573bbc3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.099952 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-config\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.100085 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdknd\" (UniqueName: \"kubernetes.io/projected/9b80cd60-a2f6-4638-a600-4d866573bbc3-kube-api-access-xdknd\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.100139 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-dns-svc\") pod 
\"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.100174 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.100216 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9jzm\" (UniqueName: \"kubernetes.io/projected/954ea93a-7914-4792-b989-9a376beed991-kube-api-access-h9jzm\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.100237 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b80cd60-a2f6-4638-a600-4d866573bbc3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.123079 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.156080 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-zhsd6" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202170 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9jzm\" (UniqueName: \"kubernetes.io/projected/954ea93a-7914-4792-b989-9a376beed991-kube-api-access-h9jzm\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202229 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b80cd60-a2f6-4638-a600-4d866573bbc3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202270 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202289 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b80cd60-a2f6-4638-a600-4d866573bbc3-config\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202320 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b80cd60-a2f6-4638-a600-4d866573bbc3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202339 
4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b80cd60-a2f6-4638-a600-4d866573bbc3-scripts\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202394 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b80cd60-a2f6-4638-a600-4d866573bbc3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202442 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b80cd60-a2f6-4638-a600-4d866573bbc3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202470 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-config\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202504 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdknd\" (UniqueName: \"kubernetes.io/projected/9b80cd60-a2f6-4638-a600-4d866573bbc3-kube-api-access-xdknd\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202544 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-dns-svc\") pod 
\"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.202580 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.203977 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.204158 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-config\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.204224 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.204253 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b80cd60-a2f6-4638-a600-4d866573bbc3-config\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc 
kubenswrapper[4748]: I0320 10:54:58.204717 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b80cd60-a2f6-4638-a600-4d866573bbc3-scripts\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.204964 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.208511 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b80cd60-a2f6-4638-a600-4d866573bbc3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.209545 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b80cd60-a2f6-4638-a600-4d866573bbc3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.210412 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b80cd60-a2f6-4638-a600-4d866573bbc3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.211215 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b80cd60-a2f6-4638-a600-4d866573bbc3-metrics-certs-tls-certs\") 
pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.229150 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdknd\" (UniqueName: \"kubernetes.io/projected/9b80cd60-a2f6-4638-a600-4d866573bbc3-kube-api-access-xdknd\") pod \"ovn-northd-0\" (UID: \"9b80cd60-a2f6-4638-a600-4d866573bbc3\") " pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.229427 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9jzm\" (UniqueName: \"kubernetes.io/projected/954ea93a-7914-4792-b989-9a376beed991-kube-api-access-h9jzm\") pod \"dnsmasq-dns-86db49b7ff-8hm6n\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") " pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.333811 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.359549 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.376564 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" event={"ID":"c0179e70-7de5-4413-868b-8d442ec891e5","Type":"ContainerStarted","Data":"0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8"} Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.391725 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7","Type":"ContainerStarted","Data":"6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774"} Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.403516 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9362889-0195-4aad-96bd-ed63db88da83","Type":"ContainerStarted","Data":"6ccf567ca597230b5f8e105670b07f93dbf0776dfe28b4aadbed6ba96ca21f1b"} Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.512798 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdcvl"] Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.768361 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-zhsd6"] Mar 20 10:54:58 crc kubenswrapper[4748]: I0320 10:54:58.979279 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8hm6n"] Mar 20 10:54:58 crc kubenswrapper[4748]: W0320 10:54:58.994342 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod954ea93a_7914_4792_b989_9a376beed991.slice/crio-d3c55c5370ebc6b98b221595a9803fe09255531485f0e997eb83aa38a0f29051 WatchSource:0}: Error finding container d3c55c5370ebc6b98b221595a9803fe09255531485f0e997eb83aa38a0f29051: Status 404 returned error can't find the container with id 
d3c55c5370ebc6b98b221595a9803fe09255531485f0e997eb83aa38a0f29051 Mar 20 10:54:59 crc kubenswrapper[4748]: W0320 10:54:59.060908 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b80cd60_a2f6_4638_a600_4d866573bbc3.slice/crio-664913edcdc12facb9ec79ab4ba88a3097c91f629e33d0e260498e3f3a4dd6aa WatchSource:0}: Error finding container 664913edcdc12facb9ec79ab4ba88a3097c91f629e33d0e260498e3f3a4dd6aa: Status 404 returned error can't find the container with id 664913edcdc12facb9ec79ab4ba88a3097c91f629e33d0e260498e3f3a4dd6aa Mar 20 10:54:59 crc kubenswrapper[4748]: I0320 10:54:59.062525 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 10:54:59 crc kubenswrapper[4748]: I0320 10:54:59.410065 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" event={"ID":"0c60c0d4-dc51-4249-a35d-3a9601052fcd","Type":"ContainerStarted","Data":"983c54fd22a4523c7b017e9c1d1b1033d26281ec63520750f6f8d1e929237243"} Mar 20 10:54:59 crc kubenswrapper[4748]: I0320 10:54:59.411672 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" event={"ID":"954ea93a-7914-4792-b989-9a376beed991","Type":"ContainerStarted","Data":"d3c55c5370ebc6b98b221595a9803fe09255531485f0e997eb83aa38a0f29051"} Mar 20 10:54:59 crc kubenswrapper[4748]: I0320 10:54:59.413391 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9b80cd60-a2f6-4638-a600-4d866573bbc3","Type":"ContainerStarted","Data":"664913edcdc12facb9ec79ab4ba88a3097c91f629e33d0e260498e3f3a4dd6aa"} Mar 20 10:54:59 crc kubenswrapper[4748]: I0320 10:54:59.419186 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-zhsd6" event={"ID":"6b172e4c-c5b0-4573-b80c-9bc074489627","Type":"ContainerStarted","Data":"fed6eee9485ea04ee0b060fc3d48c9316fe253877378bd1b0c3ab3db7ecfb0d0"} 
Mar 20 10:55:00 crc kubenswrapper[4748]: I0320 10:55:00.429852 4748 generic.go:334] "Generic (PLEG): container finished" podID="74743039-97e4-46cf-8fbf-183c8c11ca20" containerID="d7cf477475f12edbe557b6ccba8fb795c3fc1e78e3ac026f21911552125d4fb4" exitCode=0 Mar 20 10:55:00 crc kubenswrapper[4748]: I0320 10:55:00.429926 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"74743039-97e4-46cf-8fbf-183c8c11ca20","Type":"ContainerDied","Data":"d7cf477475f12edbe557b6ccba8fb795c3fc1e78e3ac026f21911552125d4fb4"} Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.469207 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"74743039-97e4-46cf-8fbf-183c8c11ca20","Type":"ContainerStarted","Data":"698836ad5e5e280c9c0e221b72c4edc506ec2c4c3207ad5b133484d0a7554d85"} Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.473916 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ffd19d53-385f-45a9-a222-caa7fbf6545e","Type":"ContainerStarted","Data":"4cca4c55509844bd6486a06711f3eae70b3ec61c1db71d2b9df39854b1b94ddb"} Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.477364 4748 generic.go:334] "Generic (PLEG): container finished" podID="954ea93a-7914-4792-b989-9a376beed991" containerID="99de9558a1c154da176d583f15404d0f01c5c93bd30f19853d9beac956ffd70d" exitCode=0 Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.477470 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" event={"ID":"954ea93a-7914-4792-b989-9a376beed991","Type":"ContainerDied","Data":"99de9558a1c154da176d583f15404d0f01c5c93bd30f19853d9beac956ffd70d"} Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.479478 4748 generic.go:334] "Generic (PLEG): container finished" podID="0f69b3ab-d71e-44be-b0b0-fa830eb8756a" containerID="95e07573622ab2537fd7fa99e82ab09c065ca3fd810458a9659fce161277bbdf" exitCode=0 
Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.479540 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk" event={"ID":"0f69b3ab-d71e-44be-b0b0-fa830eb8756a","Type":"ContainerDied","Data":"95e07573622ab2537fd7fa99e82ab09c065ca3fd810458a9659fce161277bbdf"} Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.481892 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-zhsd6" event={"ID":"6b172e4c-c5b0-4573-b80c-9bc074489627","Type":"ContainerStarted","Data":"f96ac22709200e1e879080e1177dad51943b8d099e529e701bc3f8a5fb588e73"} Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.488550 4748 generic.go:334] "Generic (PLEG): container finished" podID="0c60c0d4-dc51-4249-a35d-3a9601052fcd" containerID="f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e" exitCode=0 Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.488732 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" podUID="c0179e70-7de5-4413-868b-8d442ec891e5" containerName="dnsmasq-dns" containerID="cri-o://0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8" gracePeriod=10 Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.489760 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" event={"ID":"0c60c0d4-dc51-4249-a35d-3a9601052fcd","Type":"ContainerDied","Data":"f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e"} Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.490552 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.505600 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=13.084602779 podStartE2EDuration="55.505574356s" 
podCreationTimestamp="2026-03-20 10:54:09 +0000 UTC" firstStartedPulling="2026-03-20 10:54:11.681751831 +0000 UTC m=+1086.823297645" lastFinishedPulling="2026-03-20 10:54:54.102723408 +0000 UTC m=+1129.244269222" observedRunningTime="2026-03-20 10:55:04.495218706 +0000 UTC m=+1139.636764540" watchObservedRunningTime="2026-03-20 10:55:04.505574356 +0000 UTC m=+1139.647120170" Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.507886 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.566896 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" podStartSLOduration=10.057197326 podStartE2EDuration="56.566863857s" podCreationTimestamp="2026-03-20 10:54:08 +0000 UTC" firstStartedPulling="2026-03-20 10:54:09.574569064 +0000 UTC m=+1084.716114878" lastFinishedPulling="2026-03-20 10:54:56.084235595 +0000 UTC m=+1131.225781409" observedRunningTime="2026-03-20 10:55:04.535583031 +0000 UTC m=+1139.677128845" watchObservedRunningTime="2026-03-20 10:55:04.566863857 +0000 UTC m=+1139.708409681" Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.660507 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-zhsd6" podStartSLOduration=7.660480661 podStartE2EDuration="7.660480661s" podCreationTimestamp="2026-03-20 10:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:04.653156387 +0000 UTC m=+1139.794702201" watchObservedRunningTime="2026-03-20 10:55:04.660480661 +0000 UTC m=+1139.802026475" Mar 20 10:55:04 crc kubenswrapper[4748]: I0320 10:55:04.955646 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.061609 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-dns-svc\") pod \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.061828 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-config\") pod \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.062170 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7b2j\" (UniqueName: \"kubernetes.io/projected/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-kube-api-access-d7b2j\") pod \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\" (UID: \"0f69b3ab-d71e-44be-b0b0-fa830eb8756a\") " Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.068171 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-kube-api-access-d7b2j" (OuterVolumeSpecName: "kube-api-access-d7b2j") pod "0f69b3ab-d71e-44be-b0b0-fa830eb8756a" (UID: "0f69b3ab-d71e-44be-b0b0-fa830eb8756a"). InnerVolumeSpecName "kube-api-access-d7b2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.074482 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.088659 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f69b3ab-d71e-44be-b0b0-fa830eb8756a" (UID: "0f69b3ab-d71e-44be-b0b0-fa830eb8756a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.091031 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-config" (OuterVolumeSpecName: "config") pod "0f69b3ab-d71e-44be-b0b0-fa830eb8756a" (UID: "0f69b3ab-d71e-44be-b0b0-fa830eb8756a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.163967 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-config\") pod \"c0179e70-7de5-4413-868b-8d442ec891e5\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.164101 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-dns-svc\") pod \"c0179e70-7de5-4413-868b-8d442ec891e5\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.164271 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgkv9\" (UniqueName: \"kubernetes.io/projected/c0179e70-7de5-4413-868b-8d442ec891e5-kube-api-access-sgkv9\") pod \"c0179e70-7de5-4413-868b-8d442ec891e5\" (UID: \"c0179e70-7de5-4413-868b-8d442ec891e5\") " Mar 20 10:55:05 crc kubenswrapper[4748]: 
I0320 10:55:05.164639 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.164664 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7b2j\" (UniqueName: \"kubernetes.io/projected/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-kube-api-access-d7b2j\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.164677 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f69b3ab-d71e-44be-b0b0-fa830eb8756a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.172013 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0179e70-7de5-4413-868b-8d442ec891e5-kube-api-access-sgkv9" (OuterVolumeSpecName: "kube-api-access-sgkv9") pod "c0179e70-7de5-4413-868b-8d442ec891e5" (UID: "c0179e70-7de5-4413-868b-8d442ec891e5"). InnerVolumeSpecName "kube-api-access-sgkv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.210561 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0179e70-7de5-4413-868b-8d442ec891e5" (UID: "c0179e70-7de5-4413-868b-8d442ec891e5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.224384 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-config" (OuterVolumeSpecName: "config") pod "c0179e70-7de5-4413-868b-8d442ec891e5" (UID: "c0179e70-7de5-4413-868b-8d442ec891e5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.266386 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgkv9\" (UniqueName: \"kubernetes.io/projected/c0179e70-7de5-4413-868b-8d442ec891e5-kube-api-access-sgkv9\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.266450 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.266467 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0179e70-7de5-4413-868b-8d442ec891e5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.499196 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" event={"ID":"0c60c0d4-dc51-4249-a35d-3a9601052fcd","Type":"ContainerStarted","Data":"a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d"} Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.501246 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.507602 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"27339553-c013-4538-9a4d-5bbd249c197c","Type":"ContainerStarted","Data":"d426414ea2876ef99a59d647c158af808163c12a53101530221dda3ff92ec1fb"} Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.507927 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.519140 4748 generic.go:334] "Generic (PLEG): container finished" podID="c0179e70-7de5-4413-868b-8d442ec891e5" 
containerID="0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8" exitCode=0 Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.519277 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.544759 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.546062 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" event={"ID":"c0179e70-7de5-4413-868b-8d442ec891e5","Type":"ContainerDied","Data":"0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8"} Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.546151 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dv6gr" event={"ID":"c0179e70-7de5-4413-868b-8d442ec891e5","Type":"ContainerDied","Data":"242bb20379b17471767cb9141d1623f2c76987d3ff4af3d38d6655d440a59abc"} Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.546197 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" event={"ID":"954ea93a-7914-4792-b989-9a376beed991","Type":"ContainerStarted","Data":"cf097fcea63752268a3f9ba13c17109cf4b8f7688f589534286bade0237d59d1"} Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.546214 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-qq2xk" event={"ID":"0f69b3ab-d71e-44be-b0b0-fa830eb8756a","Type":"ContainerDied","Data":"5793222b6b3cf464979eb7630466ac7e2dd9a5c51cf017ea06271d2b3a772977"} Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.546298 4748 scope.go:117] "RemoveContainer" containerID="0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.560201 4748 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" podStartSLOduration=8.56017618 podStartE2EDuration="8.56017618s" podCreationTimestamp="2026-03-20 10:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:05.526810632 +0000 UTC m=+1140.668356446" watchObservedRunningTime="2026-03-20 10:55:05.56017618 +0000 UTC m=+1140.701721994" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.596337 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.160219863 podStartE2EDuration="54.596302259s" podCreationTimestamp="2026-03-20 10:54:11 +0000 UTC" firstStartedPulling="2026-03-20 10:54:13.312015818 +0000 UTC m=+1088.453561632" lastFinishedPulling="2026-03-20 10:55:04.748098214 +0000 UTC m=+1139.889644028" observedRunningTime="2026-03-20 10:55:05.590564804 +0000 UTC m=+1140.732110628" watchObservedRunningTime="2026-03-20 10:55:05.596302259 +0000 UTC m=+1140.737848073" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.617156 4748 scope.go:117] "RemoveContainer" containerID="db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.628863 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" podStartSLOduration=8.628821096 podStartE2EDuration="8.628821096s" podCreationTimestamp="2026-03-20 10:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:05.626219571 +0000 UTC m=+1140.767765385" watchObservedRunningTime="2026-03-20 10:55:05.628821096 +0000 UTC m=+1140.770366910" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.672356 4748 scope.go:117] "RemoveContainer" 
containerID="0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8" Mar 20 10:55:05 crc kubenswrapper[4748]: E0320 10:55:05.680152 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8\": container with ID starting with 0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8 not found: ID does not exist" containerID="0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.680226 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8"} err="failed to get container status \"0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8\": rpc error: code = NotFound desc = could not find container \"0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8\": container with ID starting with 0c9350b14510d84174bc5cfc83122cdb2e8cee5632f1da2763ce251c6c3a8bf8 not found: ID does not exist" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.680272 4748 scope.go:117] "RemoveContainer" containerID="db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c" Mar 20 10:55:05 crc kubenswrapper[4748]: E0320 10:55:05.689848 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c\": container with ID starting with db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c not found: ID does not exist" containerID="db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.689906 4748 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c"} err="failed to get container status \"db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c\": rpc error: code = NotFound desc = could not find container \"db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c\": container with ID starting with db24e1dadac5e77428c5a5774e6b4b3cc54e4699e8dadf4d8cc64e3b5f6d1c2c not found: ID does not exist" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.689971 4748 scope.go:117] "RemoveContainer" containerID="95e07573622ab2537fd7fa99e82ab09c065ca3fd810458a9659fce161277bbdf" Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.694017 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qq2xk"] Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.702799 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qq2xk"] Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.717496 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dv6gr"] Mar 20 10:55:05 crc kubenswrapper[4748]: I0320 10:55:05.727172 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dv6gr"] Mar 20 10:55:06 crc kubenswrapper[4748]: I0320 10:55:06.550926 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9b80cd60-a2f6-4638-a600-4d866573bbc3","Type":"ContainerStarted","Data":"92d4bae34933c7ee04058b252229ac243769d89e72ae8031d814c4aa8db0c128"} Mar 20 10:55:06 crc kubenswrapper[4748]: I0320 10:55:06.551876 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:55:06 crc kubenswrapper[4748]: I0320 10:55:06.551908 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"9b80cd60-a2f6-4638-a600-4d866573bbc3","Type":"ContainerStarted","Data":"873fc30687f11a75d5390b46d915a585cdfeb4c1a78067d0e5df6e040625a26e"} Mar 20 10:55:06 crc kubenswrapper[4748]: I0320 10:55:06.551934 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 10:55:06 crc kubenswrapper[4748]: I0320 10:55:06.576441 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.182047288 podStartE2EDuration="9.57642099s" podCreationTimestamp="2026-03-20 10:54:57 +0000 UTC" firstStartedPulling="2026-03-20 10:54:59.065416105 +0000 UTC m=+1134.206961919" lastFinishedPulling="2026-03-20 10:55:05.459789807 +0000 UTC m=+1140.601335621" observedRunningTime="2026-03-20 10:55:06.568271376 +0000 UTC m=+1141.709817190" watchObservedRunningTime="2026-03-20 10:55:06.57642099 +0000 UTC m=+1141.717966804" Mar 20 10:55:07 crc kubenswrapper[4748]: I0320 10:55:07.531815 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f69b3ab-d71e-44be-b0b0-fa830eb8756a" path="/var/lib/kubelet/pods/0f69b3ab-d71e-44be-b0b0-fa830eb8756a/volumes" Mar 20 10:55:07 crc kubenswrapper[4748]: I0320 10:55:07.532374 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0179e70-7de5-4413-868b-8d442ec891e5" path="/var/lib/kubelet/pods/c0179e70-7de5-4413-868b-8d442ec891e5/volumes" Mar 20 10:55:08 crc kubenswrapper[4748]: I0320 10:55:08.569394 4748 generic.go:334] "Generic (PLEG): container finished" podID="ffd19d53-385f-45a9-a222-caa7fbf6545e" containerID="4cca4c55509844bd6486a06711f3eae70b3ec61c1db71d2b9df39854b1b94ddb" exitCode=0 Mar 20 10:55:08 crc kubenswrapper[4748]: I0320 10:55:08.569475 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ffd19d53-385f-45a9-a222-caa7fbf6545e","Type":"ContainerDied","Data":"4cca4c55509844bd6486a06711f3eae70b3ec61c1db71d2b9df39854b1b94ddb"} Mar 20 
10:55:09 crc kubenswrapper[4748]: I0320 10:55:09.580120 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ffd19d53-385f-45a9-a222-caa7fbf6545e","Type":"ContainerStarted","Data":"918495e214487cf1346e5b706116f7d6e2c29f14f5e938d7fd49a35eaad2f213"} Mar 20 10:55:09 crc kubenswrapper[4748]: I0320 10:55:09.602702 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371977.252096 podStartE2EDuration="59.602679014s" podCreationTimestamp="2026-03-20 10:54:10 +0000 UTC" firstStartedPulling="2026-03-20 10:54:13.343146315 +0000 UTC m=+1088.484692119" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:09.60133311 +0000 UTC m=+1144.742878934" watchObservedRunningTime="2026-03-20 10:55:09.602679014 +0000 UTC m=+1144.744224838" Mar 20 10:55:10 crc kubenswrapper[4748]: I0320 10:55:10.786244 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 10:55:10 crc kubenswrapper[4748]: I0320 10:55:10.786812 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 10:55:10 crc kubenswrapper[4748]: I0320 10:55:10.870759 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 10:55:11 crc kubenswrapper[4748]: I0320 10:55:11.670946 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 10:55:12 crc kubenswrapper[4748]: I0320 10:55:12.100685 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 10:55:12 crc kubenswrapper[4748]: I0320 10:55:12.100757 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 10:55:12 crc kubenswrapper[4748]: I0320 
10:55:12.113225 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 10:55:12 crc kubenswrapper[4748]: I0320 10:55:12.928362 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:55:12 crc kubenswrapper[4748]: I0320 10:55:12.928733 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:55:12 crc kubenswrapper[4748]: I0320 10:55:12.928806 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:55:12 crc kubenswrapper[4748]: I0320 10:55:12.929643 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c1253b5c73f9d1fbdb9567d5ed33468b56df43e62dd82201c361a89f824870ce"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:55:12 crc kubenswrapper[4748]: I0320 10:55:12.929724 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://c1253b5c73f9d1fbdb9567d5ed33468b56df43e62dd82201c361a89f824870ce" gracePeriod=600 Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.127095 4748 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.337120 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.401042 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdcvl"] Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.497407 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-blm4h"] Mar 20 10:55:13 crc kubenswrapper[4748]: E0320 10:55:13.498091 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0179e70-7de5-4413-868b-8d442ec891e5" containerName="init" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.498201 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0179e70-7de5-4413-868b-8d442ec891e5" containerName="init" Mar 20 10:55:13 crc kubenswrapper[4748]: E0320 10:55:13.498302 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f69b3ab-d71e-44be-b0b0-fa830eb8756a" containerName="init" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.498366 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f69b3ab-d71e-44be-b0b0-fa830eb8756a" containerName="init" Mar 20 10:55:13 crc kubenswrapper[4748]: E0320 10:55:13.498457 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0179e70-7de5-4413-868b-8d442ec891e5" containerName="dnsmasq-dns" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.498516 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0179e70-7de5-4413-868b-8d442ec891e5" containerName="dnsmasq-dns" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.498747 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0179e70-7de5-4413-868b-8d442ec891e5" containerName="dnsmasq-dns" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.498852 4748 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0f69b3ab-d71e-44be-b0b0-fa830eb8756a" containerName="init" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.499507 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-blm4h" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.565044 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-blm4h"] Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.606341 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-96de-account-create-update-6v8g4"] Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.607523 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-96de-account-create-update-6v8g4" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.611214 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.620712 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/089a3bc7-a7c8-4579-bb88-8e7e515e750a-operator-scripts\") pod \"placement-db-create-blm4h\" (UID: \"089a3bc7-a7c8-4579-bb88-8e7e515e750a\") " pod="openstack/placement-db-create-blm4h" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.620792 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxh2z\" (UniqueName: \"kubernetes.io/projected/089a3bc7-a7c8-4579-bb88-8e7e515e750a-kube-api-access-mxh2z\") pod \"placement-db-create-blm4h\" (UID: \"089a3bc7-a7c8-4579-bb88-8e7e515e750a\") " pod="openstack/placement-db-create-blm4h" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.623908 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" 
containerID="c1253b5c73f9d1fbdb9567d5ed33468b56df43e62dd82201c361a89f824870ce" exitCode=0 Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.624002 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"c1253b5c73f9d1fbdb9567d5ed33468b56df43e62dd82201c361a89f824870ce"} Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.624054 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"5702c4811be941554197075836f07c222a1d80d3d9c15e6ccc7b992ee69ce82c"} Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.624074 4748 scope.go:117] "RemoveContainer" containerID="9de74f70ec1f6bdcd394aa24f2c91f712fff4a83d715b6c60cbd64842427bc57" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.624130 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" podUID="0c60c0d4-dc51-4249-a35d-3a9601052fcd" containerName="dnsmasq-dns" containerID="cri-o://a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d" gracePeriod=10 Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.633711 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-96de-account-create-update-6v8g4"] Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.723392 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/089a3bc7-a7c8-4579-bb88-8e7e515e750a-operator-scripts\") pod \"placement-db-create-blm4h\" (UID: \"089a3bc7-a7c8-4579-bb88-8e7e515e750a\") " pod="openstack/placement-db-create-blm4h" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.723518 4748 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mxh2z\" (UniqueName: \"kubernetes.io/projected/089a3bc7-a7c8-4579-bb88-8e7e515e750a-kube-api-access-mxh2z\") pod \"placement-db-create-blm4h\" (UID: \"089a3bc7-a7c8-4579-bb88-8e7e515e750a\") " pod="openstack/placement-db-create-blm4h" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.723559 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g57hv\" (UniqueName: \"kubernetes.io/projected/ff27e027-9d58-432b-81aa-6ceacbf7fe94-kube-api-access-g57hv\") pod \"placement-96de-account-create-update-6v8g4\" (UID: \"ff27e027-9d58-432b-81aa-6ceacbf7fe94\") " pod="openstack/placement-96de-account-create-update-6v8g4" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.723642 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff27e027-9d58-432b-81aa-6ceacbf7fe94-operator-scripts\") pod \"placement-96de-account-create-update-6v8g4\" (UID: \"ff27e027-9d58-432b-81aa-6ceacbf7fe94\") " pod="openstack/placement-96de-account-create-update-6v8g4" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.724706 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/089a3bc7-a7c8-4579-bb88-8e7e515e750a-operator-scripts\") pod \"placement-db-create-blm4h\" (UID: \"089a3bc7-a7c8-4579-bb88-8e7e515e750a\") " pod="openstack/placement-db-create-blm4h" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.752857 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxh2z\" (UniqueName: \"kubernetes.io/projected/089a3bc7-a7c8-4579-bb88-8e7e515e750a-kube-api-access-mxh2z\") pod \"placement-db-create-blm4h\" (UID: \"089a3bc7-a7c8-4579-bb88-8e7e515e750a\") " pod="openstack/placement-db-create-blm4h" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.820465 4748 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-blm4h" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.825056 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g57hv\" (UniqueName: \"kubernetes.io/projected/ff27e027-9d58-432b-81aa-6ceacbf7fe94-kube-api-access-g57hv\") pod \"placement-96de-account-create-update-6v8g4\" (UID: \"ff27e027-9d58-432b-81aa-6ceacbf7fe94\") " pod="openstack/placement-96de-account-create-update-6v8g4" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.825286 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff27e027-9d58-432b-81aa-6ceacbf7fe94-operator-scripts\") pod \"placement-96de-account-create-update-6v8g4\" (UID: \"ff27e027-9d58-432b-81aa-6ceacbf7fe94\") " pod="openstack/placement-96de-account-create-update-6v8g4" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.826423 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff27e027-9d58-432b-81aa-6ceacbf7fe94-operator-scripts\") pod \"placement-96de-account-create-update-6v8g4\" (UID: \"ff27e027-9d58-432b-81aa-6ceacbf7fe94\") " pod="openstack/placement-96de-account-create-update-6v8g4" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.850379 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g57hv\" (UniqueName: \"kubernetes.io/projected/ff27e027-9d58-432b-81aa-6ceacbf7fe94-kube-api-access-g57hv\") pod \"placement-96de-account-create-update-6v8g4\" (UID: \"ff27e027-9d58-432b-81aa-6ceacbf7fe94\") " pod="openstack/placement-96de-account-create-update-6v8g4" Mar 20 10:55:13 crc kubenswrapper[4748]: I0320 10:55:13.936235 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-96de-account-create-update-6v8g4" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.092736 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.135112 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-config\") pod \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.135168 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-827hq\" (UniqueName: \"kubernetes.io/projected/0c60c0d4-dc51-4249-a35d-3a9601052fcd-kube-api-access-827hq\") pod \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.135253 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-dns-svc\") pod \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.135332 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-ovsdbserver-nb\") pod \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\" (UID: \"0c60c0d4-dc51-4249-a35d-3a9601052fcd\") " Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.143373 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c60c0d4-dc51-4249-a35d-3a9601052fcd-kube-api-access-827hq" (OuterVolumeSpecName: "kube-api-access-827hq") pod "0c60c0d4-dc51-4249-a35d-3a9601052fcd" (UID: 
"0c60c0d4-dc51-4249-a35d-3a9601052fcd"). InnerVolumeSpecName "kube-api-access-827hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.182618 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0c60c0d4-dc51-4249-a35d-3a9601052fcd" (UID: "0c60c0d4-dc51-4249-a35d-3a9601052fcd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.183253 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-config" (OuterVolumeSpecName: "config") pod "0c60c0d4-dc51-4249-a35d-3a9601052fcd" (UID: "0c60c0d4-dc51-4249-a35d-3a9601052fcd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.186542 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0c60c0d4-dc51-4249-a35d-3a9601052fcd" (UID: "0c60c0d4-dc51-4249-a35d-3a9601052fcd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.237324 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.237362 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.237372 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-827hq\" (UniqueName: \"kubernetes.io/projected/0c60c0d4-dc51-4249-a35d-3a9601052fcd-kube-api-access-827hq\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.237382 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c60c0d4-dc51-4249-a35d-3a9601052fcd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.300240 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-blm4h"] Mar 20 10:55:14 crc kubenswrapper[4748]: W0320 10:55:14.306987 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod089a3bc7_a7c8_4579_bb88_8e7e515e750a.slice/crio-63d6a6a6b99d4b3f4adcf089fbf5a2b09011133eb173de54f681fb69f3b5ff97 WatchSource:0}: Error finding container 63d6a6a6b99d4b3f4adcf089fbf5a2b09011133eb173de54f681fb69f3b5ff97: Status 404 returned error can't find the container with id 63d6a6a6b99d4b3f4adcf089fbf5a2b09011133eb173de54f681fb69f3b5ff97 Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.422173 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-96de-account-create-update-6v8g4"] Mar 20 
10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.640065 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-blm4h" event={"ID":"089a3bc7-a7c8-4579-bb88-8e7e515e750a","Type":"ContainerStarted","Data":"a594226dd2e1fe41ac6300f680114ffedfadebb30c0ede6352508cec5a95a15e"} Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.640128 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-blm4h" event={"ID":"089a3bc7-a7c8-4579-bb88-8e7e515e750a","Type":"ContainerStarted","Data":"63d6a6a6b99d4b3f4adcf089fbf5a2b09011133eb173de54f681fb69f3b5ff97"} Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.644299 4748 generic.go:334] "Generic (PLEG): container finished" podID="0c60c0d4-dc51-4249-a35d-3a9601052fcd" containerID="a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d" exitCode=0 Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.644383 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" event={"ID":"0c60c0d4-dc51-4249-a35d-3a9601052fcd","Type":"ContainerDied","Data":"a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d"} Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.644424 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" event={"ID":"0c60c0d4-dc51-4249-a35d-3a9601052fcd","Type":"ContainerDied","Data":"983c54fd22a4523c7b017e9c1d1b1033d26281ec63520750f6f8d1e929237243"} Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.644443 4748 scope.go:117] "RemoveContainer" containerID="a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.644557 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-kdcvl" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.647515 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-96de-account-create-update-6v8g4" event={"ID":"ff27e027-9d58-432b-81aa-6ceacbf7fe94","Type":"ContainerStarted","Data":"71e234e3809b3d8da9dde5d5f2196c5e4903bdc342f50c61309a220a0ca75dbe"} Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.647573 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-96de-account-create-update-6v8g4" event={"ID":"ff27e027-9d58-432b-81aa-6ceacbf7fe94","Type":"ContainerStarted","Data":"b07b3e45413689ad00a3aea551eac803ff592157c57ae369bbfe9bc232cd561f"} Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.663435 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-blm4h" podStartSLOduration=1.663409796 podStartE2EDuration="1.663409796s" podCreationTimestamp="2026-03-20 10:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:14.657436126 +0000 UTC m=+1149.799002640" watchObservedRunningTime="2026-03-20 10:55:14.663409796 +0000 UTC m=+1149.804955610" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.674873 4748 scope.go:117] "RemoveContainer" containerID="f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.685021 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-96de-account-create-update-6v8g4" podStartSLOduration=1.684995799 podStartE2EDuration="1.684995799s" podCreationTimestamp="2026-03-20 10:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:14.678564197 +0000 UTC m=+1149.820110011" 
watchObservedRunningTime="2026-03-20 10:55:14.684995799 +0000 UTC m=+1149.826541613" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.708055 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdcvl"] Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.715405 4748 scope.go:117] "RemoveContainer" containerID="a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.717523 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdcvl"] Mar 20 10:55:14 crc kubenswrapper[4748]: E0320 10:55:14.718130 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d\": container with ID starting with a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d not found: ID does not exist" containerID="a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.718177 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d"} err="failed to get container status \"a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d\": rpc error: code = NotFound desc = could not find container \"a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d\": container with ID starting with a526abb7c92cd167c15ca0d20f783c77788e041efa605d3d51ca4e94e8fa699d not found: ID does not exist" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.718211 4748 scope.go:117] "RemoveContainer" containerID="f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e" Mar 20 10:55:14 crc kubenswrapper[4748]: E0320 10:55:14.721089 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e\": container with ID starting with f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e not found: ID does not exist" containerID="f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.721153 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e"} err="failed to get container status \"f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e\": rpc error: code = NotFound desc = could not find container \"f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e\": container with ID starting with f94717516fbfb078d41ff2191b61c8f6f726fff3b36db70a3c06eeb40f1b0a8e not found: ID does not exist" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.926082 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-jb9kr"] Mar 20 10:55:14 crc kubenswrapper[4748]: E0320 10:55:14.926781 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c60c0d4-dc51-4249-a35d-3a9601052fcd" containerName="dnsmasq-dns" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.926799 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c60c0d4-dc51-4249-a35d-3a9601052fcd" containerName="dnsmasq-dns" Mar 20 10:55:14 crc kubenswrapper[4748]: E0320 10:55:14.926820 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c60c0d4-dc51-4249-a35d-3a9601052fcd" containerName="init" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.926846 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c60c0d4-dc51-4249-a35d-3a9601052fcd" containerName="init" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.927015 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c60c0d4-dc51-4249-a35d-3a9601052fcd" 
containerName="dnsmasq-dns" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.927986 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:14 crc kubenswrapper[4748]: I0320 10:55:14.964359 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jb9kr"] Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.052787 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-dns-svc\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.052889 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.052912 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.052973 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-config\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc 
kubenswrapper[4748]: I0320 10:55:15.052997 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qz5v\" (UniqueName: \"kubernetes.io/projected/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-kube-api-access-9qz5v\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.154925 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-config\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.154985 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qz5v\" (UniqueName: \"kubernetes.io/projected/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-kube-api-access-9qz5v\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.155061 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-dns-svc\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.155102 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 
10:55:15.155129 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.156035 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.156062 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-config\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.156115 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.156751 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-dns-svc\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.176980 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qz5v\" 
(UniqueName: \"kubernetes.io/projected/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-kube-api-access-9qz5v\") pod \"dnsmasq-dns-698758b865-jb9kr\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.254948 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.526757 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c60c0d4-dc51-4249-a35d-3a9601052fcd" path="/var/lib/kubelet/pods/0c60c0d4-dc51-4249-a35d-3a9601052fcd/volumes" Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.662875 4748 generic.go:334] "Generic (PLEG): container finished" podID="089a3bc7-a7c8-4579-bb88-8e7e515e750a" containerID="a594226dd2e1fe41ac6300f680114ffedfadebb30c0ede6352508cec5a95a15e" exitCode=0 Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.663038 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-blm4h" event={"ID":"089a3bc7-a7c8-4579-bb88-8e7e515e750a","Type":"ContainerDied","Data":"a594226dd2e1fe41ac6300f680114ffedfadebb30c0ede6352508cec5a95a15e"} Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.664919 4748 generic.go:334] "Generic (PLEG): container finished" podID="ff27e027-9d58-432b-81aa-6ceacbf7fe94" containerID="71e234e3809b3d8da9dde5d5f2196c5e4903bdc342f50c61309a220a0ca75dbe" exitCode=0 Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.664950 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-96de-account-create-update-6v8g4" event={"ID":"ff27e027-9d58-432b-81aa-6ceacbf7fe94","Type":"ContainerDied","Data":"71e234e3809b3d8da9dde5d5f2196c5e4903bdc342f50c61309a220a0ca75dbe"} Mar 20 10:55:15 crc kubenswrapper[4748]: I0320 10:55:15.751041 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jb9kr"] Mar 20 10:55:15 crc 
kubenswrapper[4748]: W0320 10:55:15.751263 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe43a4aa_ef4d_4509_8760_b98a4de5b2e5.slice/crio-99179eae6e1188d0ddd1698060bd0537b36751de5986cf3d13bc2f2b07aba527 WatchSource:0}: Error finding container 99179eae6e1188d0ddd1698060bd0537b36751de5986cf3d13bc2f2b07aba527: Status 404 returned error can't find the container with id 99179eae6e1188d0ddd1698060bd0537b36751de5986cf3d13bc2f2b07aba527 Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.098778 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.105374 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.107910 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.108149 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.113483 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.113961 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-7b9ps" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.122224 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.172987 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " 
pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.173087 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7753c601-7739-4165-b5f2-a673b0797334-cache\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.173158 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.173214 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7753c601-7739-4165-b5f2-a673b0797334-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.173308 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwg59\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-kube-api-access-cwg59\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.173364 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7753c601-7739-4165-b5f2-a673b0797334-lock\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.192250 4748 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.262928 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.275396 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7753c601-7739-4165-b5f2-a673b0797334-cache\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.275460 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.275497 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7753c601-7739-4165-b5f2-a673b0797334-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.275570 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwg59\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-kube-api-access-cwg59\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.275638 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7753c601-7739-4165-b5f2-a673b0797334-lock\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") 
" pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.275782 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: E0320 10:55:16.276017 4748 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 10:55:16 crc kubenswrapper[4748]: E0320 10:55:16.276056 4748 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 10:55:16 crc kubenswrapper[4748]: E0320 10:55:16.276131 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift podName:7753c601-7739-4165-b5f2-a673b0797334 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:16.776103241 +0000 UTC m=+1151.917649065 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift") pod "swift-storage-0" (UID: "7753c601-7739-4165-b5f2-a673b0797334") : configmap "swift-ring-files" not found Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.276244 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7753c601-7739-4165-b5f2-a673b0797334-lock\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.276029 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7753c601-7739-4165-b5f2-a673b0797334-cache\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.275801 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.281289 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7753c601-7739-4165-b5f2-a673b0797334-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.304013 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwg59\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-kube-api-access-cwg59\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " 
pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.313996 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.673142 4748 generic.go:334] "Generic (PLEG): container finished" podID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerID="7c06a4bcfc3a018e2e063d74f0306892b82ab0ac2ca20afe5ec5de3fd15e55cf" exitCode=0 Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.673308 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jb9kr" event={"ID":"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5","Type":"ContainerDied","Data":"7c06a4bcfc3a018e2e063d74f0306892b82ab0ac2ca20afe5ec5de3fd15e55cf"} Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.673367 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jb9kr" event={"ID":"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5","Type":"ContainerStarted","Data":"99179eae6e1188d0ddd1698060bd0537b36751de5986cf3d13bc2f2b07aba527"} Mar 20 10:55:16 crc kubenswrapper[4748]: I0320 10:55:16.785312 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:16 crc kubenswrapper[4748]: E0320 10:55:16.791924 4748 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 10:55:16 crc kubenswrapper[4748]: E0320 10:55:16.791992 4748 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 
10:55:16 crc kubenswrapper[4748]: E0320 10:55:16.792107 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift podName:7753c601-7739-4165-b5f2-a673b0797334 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:17.792024552 +0000 UTC m=+1152.933570366 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift") pod "swift-storage-0" (UID: "7753c601-7739-4165-b5f2-a673b0797334") : configmap "swift-ring-files" not found Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.008998 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-96de-account-create-update-6v8g4" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.091809 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff27e027-9d58-432b-81aa-6ceacbf7fe94-operator-scripts\") pod \"ff27e027-9d58-432b-81aa-6ceacbf7fe94\" (UID: \"ff27e027-9d58-432b-81aa-6ceacbf7fe94\") " Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.092433 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g57hv\" (UniqueName: \"kubernetes.io/projected/ff27e027-9d58-432b-81aa-6ceacbf7fe94-kube-api-access-g57hv\") pod \"ff27e027-9d58-432b-81aa-6ceacbf7fe94\" (UID: \"ff27e027-9d58-432b-81aa-6ceacbf7fe94\") " Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.092656 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff27e027-9d58-432b-81aa-6ceacbf7fe94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff27e027-9d58-432b-81aa-6ceacbf7fe94" (UID: "ff27e027-9d58-432b-81aa-6ceacbf7fe94"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.093007 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff27e027-9d58-432b-81aa-6ceacbf7fe94-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.097176 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff27e027-9d58-432b-81aa-6ceacbf7fe94-kube-api-access-g57hv" (OuterVolumeSpecName: "kube-api-access-g57hv") pod "ff27e027-9d58-432b-81aa-6ceacbf7fe94" (UID: "ff27e027-9d58-432b-81aa-6ceacbf7fe94"). InnerVolumeSpecName "kube-api-access-g57hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.158415 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-blm4h" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.194321 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxh2z\" (UniqueName: \"kubernetes.io/projected/089a3bc7-a7c8-4579-bb88-8e7e515e750a-kube-api-access-mxh2z\") pod \"089a3bc7-a7c8-4579-bb88-8e7e515e750a\" (UID: \"089a3bc7-a7c8-4579-bb88-8e7e515e750a\") " Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.194411 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/089a3bc7-a7c8-4579-bb88-8e7e515e750a-operator-scripts\") pod \"089a3bc7-a7c8-4579-bb88-8e7e515e750a\" (UID: \"089a3bc7-a7c8-4579-bb88-8e7e515e750a\") " Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.194751 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g57hv\" (UniqueName: \"kubernetes.io/projected/ff27e027-9d58-432b-81aa-6ceacbf7fe94-kube-api-access-g57hv\") on node \"crc\" DevicePath \"\"" Mar 20 
10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.195169 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/089a3bc7-a7c8-4579-bb88-8e7e515e750a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "089a3bc7-a7c8-4579-bb88-8e7e515e750a" (UID: "089a3bc7-a7c8-4579-bb88-8e7e515e750a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.200784 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/089a3bc7-a7c8-4579-bb88-8e7e515e750a-kube-api-access-mxh2z" (OuterVolumeSpecName: "kube-api-access-mxh2z") pod "089a3bc7-a7c8-4579-bb88-8e7e515e750a" (UID: "089a3bc7-a7c8-4579-bb88-8e7e515e750a"). InnerVolumeSpecName "kube-api-access-mxh2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.295914 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxh2z\" (UniqueName: \"kubernetes.io/projected/089a3bc7-a7c8-4579-bb88-8e7e515e750a-kube-api-access-mxh2z\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.295951 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/089a3bc7-a7c8-4579-bb88-8e7e515e750a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.681869 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jb9kr" event={"ID":"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5","Type":"ContainerStarted","Data":"8790c054f45fcf832b100fb8aef752a10849709a0806741a7a4087d053414ea1"} Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.681955 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 
10:55:17.683575 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-96de-account-create-update-6v8g4" event={"ID":"ff27e027-9d58-432b-81aa-6ceacbf7fe94","Type":"ContainerDied","Data":"b07b3e45413689ad00a3aea551eac803ff592157c57ae369bbfe9bc232cd561f"} Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.683600 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b07b3e45413689ad00a3aea551eac803ff592157c57ae369bbfe9bc232cd561f" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.683644 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-96de-account-create-update-6v8g4" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.686373 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-blm4h" event={"ID":"089a3bc7-a7c8-4579-bb88-8e7e515e750a","Type":"ContainerDied","Data":"63d6a6a6b99d4b3f4adcf089fbf5a2b09011133eb173de54f681fb69f3b5ff97"} Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.686577 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63d6a6a6b99d4b3f4adcf089fbf5a2b09011133eb173de54f681fb69f3b5ff97" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.686558 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-blm4h" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.721138 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-jb9kr" podStartSLOduration=3.72110913 podStartE2EDuration="3.72110913s" podCreationTimestamp="2026-03-20 10:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:17.707099858 +0000 UTC m=+1152.848645672" watchObservedRunningTime="2026-03-20 10:55:17.72110913 +0000 UTC m=+1152.862654944" Mar 20 10:55:17 crc kubenswrapper[4748]: I0320 10:55:17.803593 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:17 crc kubenswrapper[4748]: E0320 10:55:17.804014 4748 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 10:55:17 crc kubenswrapper[4748]: E0320 10:55:17.805021 4748 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 10:55:17 crc kubenswrapper[4748]: E0320 10:55:17.805163 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift podName:7753c601-7739-4165-b5f2-a673b0797334 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:19.805136303 +0000 UTC m=+1154.946682137 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift") pod "swift-storage-0" (UID: "7753c601-7739-4165-b5f2-a673b0797334") : configmap "swift-ring-files" not found Mar 20 10:55:18 crc kubenswrapper[4748]: I0320 10:55:18.423717 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.039789 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-q2gxz"] Mar 20 10:55:19 crc kubenswrapper[4748]: E0320 10:55:19.040851 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff27e027-9d58-432b-81aa-6ceacbf7fe94" containerName="mariadb-account-create-update" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.040944 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff27e027-9d58-432b-81aa-6ceacbf7fe94" containerName="mariadb-account-create-update" Mar 20 10:55:19 crc kubenswrapper[4748]: E0320 10:55:19.041032 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089a3bc7-a7c8-4579-bb88-8e7e515e750a" containerName="mariadb-database-create" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.041100 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="089a3bc7-a7c8-4579-bb88-8e7e515e750a" containerName="mariadb-database-create" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.041353 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="089a3bc7-a7c8-4579-bb88-8e7e515e750a" containerName="mariadb-database-create" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.041465 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff27e027-9d58-432b-81aa-6ceacbf7fe94" containerName="mariadb-account-create-update" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.042196 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-q2gxz" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.045002 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.052811 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q2gxz"] Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.127424 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn6jg\" (UniqueName: \"kubernetes.io/projected/4029da34-a74b-481e-a7b8-692ff54e61f9-kube-api-access-wn6jg\") pod \"root-account-create-update-q2gxz\" (UID: \"4029da34-a74b-481e-a7b8-692ff54e61f9\") " pod="openstack/root-account-create-update-q2gxz" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.127901 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4029da34-a74b-481e-a7b8-692ff54e61f9-operator-scripts\") pod \"root-account-create-update-q2gxz\" (UID: \"4029da34-a74b-481e-a7b8-692ff54e61f9\") " pod="openstack/root-account-create-update-q2gxz" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.229578 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4029da34-a74b-481e-a7b8-692ff54e61f9-operator-scripts\") pod \"root-account-create-update-q2gxz\" (UID: \"4029da34-a74b-481e-a7b8-692ff54e61f9\") " pod="openstack/root-account-create-update-q2gxz" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.230052 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn6jg\" (UniqueName: \"kubernetes.io/projected/4029da34-a74b-481e-a7b8-692ff54e61f9-kube-api-access-wn6jg\") pod \"root-account-create-update-q2gxz\" (UID: 
\"4029da34-a74b-481e-a7b8-692ff54e61f9\") " pod="openstack/root-account-create-update-q2gxz" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.230541 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4029da34-a74b-481e-a7b8-692ff54e61f9-operator-scripts\") pod \"root-account-create-update-q2gxz\" (UID: \"4029da34-a74b-481e-a7b8-692ff54e61f9\") " pod="openstack/root-account-create-update-q2gxz" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.255765 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn6jg\" (UniqueName: \"kubernetes.io/projected/4029da34-a74b-481e-a7b8-692ff54e61f9-kube-api-access-wn6jg\") pod \"root-account-create-update-q2gxz\" (UID: \"4029da34-a74b-481e-a7b8-692ff54e61f9\") " pod="openstack/root-account-create-update-q2gxz" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.357936 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-q2gxz" Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.777912 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q2gxz"] Mar 20 10:55:19 crc kubenswrapper[4748]: I0320 10:55:19.841476 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:19 crc kubenswrapper[4748]: E0320 10:55:19.841773 4748 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 10:55:19 crc kubenswrapper[4748]: E0320 10:55:19.841829 4748 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 10:55:19 crc kubenswrapper[4748]: E0320 10:55:19.841944 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift podName:7753c601-7739-4165-b5f2-a673b0797334 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:23.8419157 +0000 UTC m=+1158.983461534 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift") pod "swift-storage-0" (UID: "7753c601-7739-4165-b5f2-a673b0797334") : configmap "swift-ring-files" not found Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.035119 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mlztp"] Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.036750 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.040539 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.047048 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mlztp"] Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.060335 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.060691 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.147502 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z5bv\" (UniqueName: \"kubernetes.io/projected/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-kube-api-access-7z5bv\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.147605 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-etc-swift\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.147632 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-dispersionconf\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 
10:55:20.147718 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-scripts\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.147743 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-combined-ca-bundle\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.147778 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-ring-data-devices\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.147801 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-swiftconf\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.249502 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-dispersionconf\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.249622 
4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-scripts\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.249647 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-combined-ca-bundle\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.249689 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-ring-data-devices\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.249710 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-swiftconf\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.249770 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z5bv\" (UniqueName: \"kubernetes.io/projected/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-kube-api-access-7z5bv\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.249815 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-etc-swift\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.250355 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-etc-swift\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.251031 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-scripts\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.251617 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-ring-data-devices\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.257584 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-dispersionconf\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.259504 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-swiftconf\") pod \"swift-ring-rebalance-mlztp\" 
(UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.260486 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-combined-ca-bundle\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.271560 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z5bv\" (UniqueName: \"kubernetes.io/projected/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-kube-api-access-7z5bv\") pod \"swift-ring-rebalance-mlztp\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.362073 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.715893 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q2gxz" event={"ID":"4029da34-a74b-481e-a7b8-692ff54e61f9","Type":"ContainerStarted","Data":"f36200a33dc22e05dc3c7d05e0b92e8c66da61fab414e967919c12f075a3e662"} Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.716209 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q2gxz" event={"ID":"4029da34-a74b-481e-a7b8-692ff54e61f9","Type":"ContainerStarted","Data":"7dded27ddce30be3328a62bf59b4fccfbb2bcd98a1c8938c8d707a488beb986e"} Mar 20 10:55:20 crc kubenswrapper[4748]: I0320 10:55:20.822824 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mlztp"] Mar 20 10:55:21 crc kubenswrapper[4748]: I0320 10:55:21.731252 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-ring-rebalance-mlztp" event={"ID":"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17","Type":"ContainerStarted","Data":"d19766b63e59dee1d42eb0af65bd1b1d8a8ee2aaef6b74254a36804a339ca987"} Mar 20 10:55:21 crc kubenswrapper[4748]: I0320 10:55:21.750504 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-q2gxz" podStartSLOduration=2.750477903 podStartE2EDuration="2.750477903s" podCreationTimestamp="2026-03-20 10:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:21.748094713 +0000 UTC m=+1156.889640537" watchObservedRunningTime="2026-03-20 10:55:21.750477903 +0000 UTC m=+1156.892023717" Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.270738 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-n8tkm"] Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.272377 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-n8tkm"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.279111 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n8tkm"]
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.286444 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvnt\" (UniqueName: \"kubernetes.io/projected/8949f08a-2d27-4055-86a7-a66a77de530b-kube-api-access-2wvnt\") pod \"glance-db-create-n8tkm\" (UID: \"8949f08a-2d27-4055-86a7-a66a77de530b\") " pod="openstack/glance-db-create-n8tkm"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.286992 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8949f08a-2d27-4055-86a7-a66a77de530b-operator-scripts\") pod \"glance-db-create-n8tkm\" (UID: \"8949f08a-2d27-4055-86a7-a66a77de530b\") " pod="openstack/glance-db-create-n8tkm"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.391620 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvnt\" (UniqueName: \"kubernetes.io/projected/8949f08a-2d27-4055-86a7-a66a77de530b-kube-api-access-2wvnt\") pod \"glance-db-create-n8tkm\" (UID: \"8949f08a-2d27-4055-86a7-a66a77de530b\") " pod="openstack/glance-db-create-n8tkm"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.391857 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8949f08a-2d27-4055-86a7-a66a77de530b-operator-scripts\") pod \"glance-db-create-n8tkm\" (UID: \"8949f08a-2d27-4055-86a7-a66a77de530b\") " pod="openstack/glance-db-create-n8tkm"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.393092 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8949f08a-2d27-4055-86a7-a66a77de530b-operator-scripts\") pod \"glance-db-create-n8tkm\" (UID: \"8949f08a-2d27-4055-86a7-a66a77de530b\") " pod="openstack/glance-db-create-n8tkm"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.396052 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7a59-account-create-update-vt42l"]
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.402268 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7a59-account-create-update-vt42l"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.407276 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.412389 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7a59-account-create-update-vt42l"]
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.427595 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvnt\" (UniqueName: \"kubernetes.io/projected/8949f08a-2d27-4055-86a7-a66a77de530b-kube-api-access-2wvnt\") pod \"glance-db-create-n8tkm\" (UID: \"8949f08a-2d27-4055-86a7-a66a77de530b\") " pod="openstack/glance-db-create-n8tkm"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.493909 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29b85\" (UniqueName: \"kubernetes.io/projected/5905f14e-ca31-4983-9335-db320b71f0d1-kube-api-access-29b85\") pod \"glance-7a59-account-create-update-vt42l\" (UID: \"5905f14e-ca31-4983-9335-db320b71f0d1\") " pod="openstack/glance-7a59-account-create-update-vt42l"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.494577 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5905f14e-ca31-4983-9335-db320b71f0d1-operator-scripts\") pod \"glance-7a59-account-create-update-vt42l\" (UID: \"5905f14e-ca31-4983-9335-db320b71f0d1\") " pod="openstack/glance-7a59-account-create-update-vt42l"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.596787 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29b85\" (UniqueName: \"kubernetes.io/projected/5905f14e-ca31-4983-9335-db320b71f0d1-kube-api-access-29b85\") pod \"glance-7a59-account-create-update-vt42l\" (UID: \"5905f14e-ca31-4983-9335-db320b71f0d1\") " pod="openstack/glance-7a59-account-create-update-vt42l"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.596885 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5905f14e-ca31-4983-9335-db320b71f0d1-operator-scripts\") pod \"glance-7a59-account-create-update-vt42l\" (UID: \"5905f14e-ca31-4983-9335-db320b71f0d1\") " pod="openstack/glance-7a59-account-create-update-vt42l"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.598130 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5905f14e-ca31-4983-9335-db320b71f0d1-operator-scripts\") pod \"glance-7a59-account-create-update-vt42l\" (UID: \"5905f14e-ca31-4983-9335-db320b71f0d1\") " pod="openstack/glance-7a59-account-create-update-vt42l"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.600021 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n8tkm"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.652081 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29b85\" (UniqueName: \"kubernetes.io/projected/5905f14e-ca31-4983-9335-db320b71f0d1-kube-api-access-29b85\") pod \"glance-7a59-account-create-update-vt42l\" (UID: \"5905f14e-ca31-4983-9335-db320b71f0d1\") " pod="openstack/glance-7a59-account-create-update-vt42l"
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.761485 4748 generic.go:334] "Generic (PLEG): container finished" podID="4029da34-a74b-481e-a7b8-692ff54e61f9" containerID="f36200a33dc22e05dc3c7d05e0b92e8c66da61fab414e967919c12f075a3e662" exitCode=0
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.761546 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q2gxz" event={"ID":"4029da34-a74b-481e-a7b8-692ff54e61f9","Type":"ContainerDied","Data":"f36200a33dc22e05dc3c7d05e0b92e8c66da61fab414e967919c12f075a3e662"}
Mar 20 10:55:22 crc kubenswrapper[4748]: I0320 10:55:22.761725 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7a59-account-create-update-vt42l"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.032805 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mzn7b"]
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.033974 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mzn7b"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.047978 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mzn7b"]
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.108124 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z62s5\" (UniqueName: \"kubernetes.io/projected/056ff617-ed15-4344-8883-afb1abd38abb-kube-api-access-z62s5\") pod \"keystone-db-create-mzn7b\" (UID: \"056ff617-ed15-4344-8883-afb1abd38abb\") " pod="openstack/keystone-db-create-mzn7b"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.108358 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056ff617-ed15-4344-8883-afb1abd38abb-operator-scripts\") pod \"keystone-db-create-mzn7b\" (UID: \"056ff617-ed15-4344-8883-afb1abd38abb\") " pod="openstack/keystone-db-create-mzn7b"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.137504 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-915e-account-create-update-rq9s8"]
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.138828 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-915e-account-create-update-rq9s8"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.143813 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.146650 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-915e-account-create-update-rq9s8"]
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.209788 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-operator-scripts\") pod \"keystone-915e-account-create-update-rq9s8\" (UID: \"7b34c916-eb28-4060-8a0c-22b7ae45bcaa\") " pod="openstack/keystone-915e-account-create-update-rq9s8"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.210196 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddxhn\" (UniqueName: \"kubernetes.io/projected/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-kube-api-access-ddxhn\") pod \"keystone-915e-account-create-update-rq9s8\" (UID: \"7b34c916-eb28-4060-8a0c-22b7ae45bcaa\") " pod="openstack/keystone-915e-account-create-update-rq9s8"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.210254 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z62s5\" (UniqueName: \"kubernetes.io/projected/056ff617-ed15-4344-8883-afb1abd38abb-kube-api-access-z62s5\") pod \"keystone-db-create-mzn7b\" (UID: \"056ff617-ed15-4344-8883-afb1abd38abb\") " pod="openstack/keystone-db-create-mzn7b"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.210334 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056ff617-ed15-4344-8883-afb1abd38abb-operator-scripts\") pod \"keystone-db-create-mzn7b\" (UID: \"056ff617-ed15-4344-8883-afb1abd38abb\") " pod="openstack/keystone-db-create-mzn7b"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.211259 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056ff617-ed15-4344-8883-afb1abd38abb-operator-scripts\") pod \"keystone-db-create-mzn7b\" (UID: \"056ff617-ed15-4344-8883-afb1abd38abb\") " pod="openstack/keystone-db-create-mzn7b"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.234556 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z62s5\" (UniqueName: \"kubernetes.io/projected/056ff617-ed15-4344-8883-afb1abd38abb-kube-api-access-z62s5\") pod \"keystone-db-create-mzn7b\" (UID: \"056ff617-ed15-4344-8883-afb1abd38abb\") " pod="openstack/keystone-db-create-mzn7b"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.312013 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddxhn\" (UniqueName: \"kubernetes.io/projected/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-kube-api-access-ddxhn\") pod \"keystone-915e-account-create-update-rq9s8\" (UID: \"7b34c916-eb28-4060-8a0c-22b7ae45bcaa\") " pod="openstack/keystone-915e-account-create-update-rq9s8"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.312062 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-operator-scripts\") pod \"keystone-915e-account-create-update-rq9s8\" (UID: \"7b34c916-eb28-4060-8a0c-22b7ae45bcaa\") " pod="openstack/keystone-915e-account-create-update-rq9s8"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.313070 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-operator-scripts\") pod \"keystone-915e-account-create-update-rq9s8\" (UID: \"7b34c916-eb28-4060-8a0c-22b7ae45bcaa\") " pod="openstack/keystone-915e-account-create-update-rq9s8"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.331690 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddxhn\" (UniqueName: \"kubernetes.io/projected/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-kube-api-access-ddxhn\") pod \"keystone-915e-account-create-update-rq9s8\" (UID: \"7b34c916-eb28-4060-8a0c-22b7ae45bcaa\") " pod="openstack/keystone-915e-account-create-update-rq9s8"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.379753 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mzn7b"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.456658 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-915e-account-create-update-rq9s8"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.781392 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bldp9" podUID="2482f122-92d5-410c-b4c0-41834cea1711" containerName="ovn-controller" probeResult="failure" output=<
Mar 20 10:55:23 crc kubenswrapper[4748]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 20 10:55:23 crc kubenswrapper[4748]: >
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.884371 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-756zb"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.884598 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-756zb"
Mar 20 10:55:23 crc kubenswrapper[4748]: I0320 10:55:23.928708 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0"
Mar 20 10:55:23 crc kubenswrapper[4748]: E0320 10:55:23.928973 4748 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 10:55:23 crc kubenswrapper[4748]: E0320 10:55:23.928998 4748 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 10:55:23 crc kubenswrapper[4748]: E0320 10:55:23.929057 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift podName:7753c601-7739-4165-b5f2-a673b0797334 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:31.929039905 +0000 UTC m=+1167.070585719 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift") pod "swift-storage-0" (UID: "7753c601-7739-4165-b5f2-a673b0797334") : configmap "swift-ring-files" not found
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.095612 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bldp9-config-7hfj6"]
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.097023 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.099664 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.112220 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bldp9-config-7hfj6"]
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.131796 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-scripts\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.131857 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-log-ovn\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.131978 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx22s\" (UniqueName: \"kubernetes.io/projected/f1a01622-237d-4ea0-9168-c8db48fb1478-kube-api-access-kx22s\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.132303 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run-ovn\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.132430 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-additional-scripts\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.132524 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.234594 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run-ovn\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.234656 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-additional-scripts\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.234710 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.234796 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-scripts\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.234826 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-log-ovn\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.234892 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx22s\" (UniqueName: \"kubernetes.io/projected/f1a01622-237d-4ea0-9168-c8db48fb1478-kube-api-access-kx22s\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.235020 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.235043 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run-ovn\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.235137 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-log-ovn\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.236917 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-scripts\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.237383 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-additional-scripts\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.255500 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx22s\" (UniqueName: \"kubernetes.io/projected/f1a01622-237d-4ea0-9168-c8db48fb1478-kube-api-access-kx22s\") pod \"ovn-controller-bldp9-config-7hfj6\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:24 crc kubenswrapper[4748]: I0320 10:55:24.430992 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bldp9-config-7hfj6"
Mar 20 10:55:25 crc kubenswrapper[4748]: I0320 10:55:25.257071 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-jb9kr"
Mar 20 10:55:25 crc kubenswrapper[4748]: I0320 10:55:25.304200 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8hm6n"]
Mar 20 10:55:25 crc kubenswrapper[4748]: I0320 10:55:25.304469 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" podUID="954ea93a-7914-4792-b989-9a376beed991" containerName="dnsmasq-dns" containerID="cri-o://cf097fcea63752268a3f9ba13c17109cf4b8f7688f589534286bade0237d59d1" gracePeriod=10
Mar 20 10:55:26 crc kubenswrapper[4748]: I0320 10:55:26.801325 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q2gxz" event={"ID":"4029da34-a74b-481e-a7b8-692ff54e61f9","Type":"ContainerDied","Data":"7dded27ddce30be3328a62bf59b4fccfbb2bcd98a1c8938c8d707a488beb986e"}
Mar 20 10:55:26 crc kubenswrapper[4748]: I0320 10:55:26.801403 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dded27ddce30be3328a62bf59b4fccfbb2bcd98a1c8938c8d707a488beb986e"
Mar 20 10:55:26 crc kubenswrapper[4748]: I0320 10:55:26.803990 4748 generic.go:334] "Generic (PLEG): container finished" podID="954ea93a-7914-4792-b989-9a376beed991" containerID="cf097fcea63752268a3f9ba13c17109cf4b8f7688f589534286bade0237d59d1" exitCode=0
Mar 20 10:55:26 crc kubenswrapper[4748]: I0320 10:55:26.804049 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" event={"ID":"954ea93a-7914-4792-b989-9a376beed991","Type":"ContainerDied","Data":"cf097fcea63752268a3f9ba13c17109cf4b8f7688f589534286bade0237d59d1"}
Mar 20 10:55:26 crc kubenswrapper[4748]: I0320 10:55:26.916324 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q2gxz"
Mar 20 10:55:26 crc kubenswrapper[4748]: I0320 10:55:26.994037 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4029da34-a74b-481e-a7b8-692ff54e61f9-operator-scripts\") pod \"4029da34-a74b-481e-a7b8-692ff54e61f9\" (UID: \"4029da34-a74b-481e-a7b8-692ff54e61f9\") "
Mar 20 10:55:26 crc kubenswrapper[4748]: I0320 10:55:26.994463 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn6jg\" (UniqueName: \"kubernetes.io/projected/4029da34-a74b-481e-a7b8-692ff54e61f9-kube-api-access-wn6jg\") pod \"4029da34-a74b-481e-a7b8-692ff54e61f9\" (UID: \"4029da34-a74b-481e-a7b8-692ff54e61f9\") "
Mar 20 10:55:26 crc kubenswrapper[4748]: I0320 10:55:26.997144 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4029da34-a74b-481e-a7b8-692ff54e61f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4029da34-a74b-481e-a7b8-692ff54e61f9" (UID: "4029da34-a74b-481e-a7b8-692ff54e61f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.005049 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4029da34-a74b-481e-a7b8-692ff54e61f9-kube-api-access-wn6jg" (OuterVolumeSpecName: "kube-api-access-wn6jg") pod "4029da34-a74b-481e-a7b8-692ff54e61f9" (UID: "4029da34-a74b-481e-a7b8-692ff54e61f9"). InnerVolumeSpecName "kube-api-access-wn6jg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.084416 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n"
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.096641 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn6jg\" (UniqueName: \"kubernetes.io/projected/4029da34-a74b-481e-a7b8-692ff54e61f9-kube-api-access-wn6jg\") on node \"crc\" DevicePath \"\""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.096678 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4029da34-a74b-481e-a7b8-692ff54e61f9-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.197470 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-sb\") pod \"954ea93a-7914-4792-b989-9a376beed991\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") "
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.197535 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-config\") pod \"954ea93a-7914-4792-b989-9a376beed991\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") "
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.197646 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9jzm\" (UniqueName: \"kubernetes.io/projected/954ea93a-7914-4792-b989-9a376beed991-kube-api-access-h9jzm\") pod \"954ea93a-7914-4792-b989-9a376beed991\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") "
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.197752 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-dns-svc\") pod \"954ea93a-7914-4792-b989-9a376beed991\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") "
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.197790 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-nb\") pod \"954ea93a-7914-4792-b989-9a376beed991\" (UID: \"954ea93a-7914-4792-b989-9a376beed991\") "
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.204235 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/954ea93a-7914-4792-b989-9a376beed991-kube-api-access-h9jzm" (OuterVolumeSpecName: "kube-api-access-h9jzm") pod "954ea93a-7914-4792-b989-9a376beed991" (UID: "954ea93a-7914-4792-b989-9a376beed991"). InnerVolumeSpecName "kube-api-access-h9jzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.241565 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n8tkm"]
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.246987 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "954ea93a-7914-4792-b989-9a376beed991" (UID: "954ea93a-7914-4792-b989-9a376beed991"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.247127 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "954ea93a-7914-4792-b989-9a376beed991" (UID: "954ea93a-7914-4792-b989-9a376beed991"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.253976 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "954ea93a-7914-4792-b989-9a376beed991" (UID: "954ea93a-7914-4792-b989-9a376beed991"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.262572 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-config" (OuterVolumeSpecName: "config") pod "954ea93a-7914-4792-b989-9a376beed991" (UID: "954ea93a-7914-4792-b989-9a376beed991"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.300749 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9jzm\" (UniqueName: \"kubernetes.io/projected/954ea93a-7914-4792-b989-9a376beed991-kube-api-access-h9jzm\") on node \"crc\" DevicePath \"\""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.300777 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.300788 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.300797 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.300804 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/954ea93a-7914-4792-b989-9a376beed991-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.404770 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7a59-account-create-update-vt42l"]
Mar 20 10:55:27 crc kubenswrapper[4748]: W0320 10:55:27.409681 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5905f14e_ca31_4983_9335_db320b71f0d1.slice/crio-cfd384e0d3367ff6eb80b91a6ead8d9659d970ae1e7b3023112f4ba033984a69 WatchSource:0}: Error finding container cfd384e0d3367ff6eb80b91a6ead8d9659d970ae1e7b3023112f4ba033984a69: Status 404 returned error can't find the container with id cfd384e0d3367ff6eb80b91a6ead8d9659d970ae1e7b3023112f4ba033984a69
Mar 20 10:55:27 crc kubenswrapper[4748]: W0320 10:55:27.418151 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b34c916_eb28_4060_8a0c_22b7ae45bcaa.slice/crio-f01426a24ee59cd6ab9785ca0a3ae7f67102d9147c90d6fef54d284fd530ed65 WatchSource:0}: Error finding container f01426a24ee59cd6ab9785ca0a3ae7f67102d9147c90d6fef54d284fd530ed65: Status 404 returned error can't find the container with id f01426a24ee59cd6ab9785ca0a3ae7f67102d9147c90d6fef54d284fd530ed65
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.420764 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-915e-account-create-update-rq9s8"]
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.436340 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mzn7b"]
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.600911 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bldp9-config-7hfj6"]
Mar 20 10:55:27 crc kubenswrapper[4748]: W0320 10:55:27.605658 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1a01622_237d_4ea0_9168_c8db48fb1478.slice/crio-98394b474fb915a92c2b531a1ee4476f483e28dfeb065d7d11f881fcbbf966f2 WatchSource:0}: Error finding container 98394b474fb915a92c2b531a1ee4476f483e28dfeb065d7d11f881fcbbf966f2: Status 404 returned error can't find the container with id 98394b474fb915a92c2b531a1ee4476f483e28dfeb065d7d11f881fcbbf966f2
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.817302 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mlztp" event={"ID":"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17","Type":"ContainerStarted","Data":"863987fe20374b36fdc5f4e6a6d860e55e30e7e7660c1768c5557c8667f9fc13"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.818954 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bldp9-config-7hfj6" event={"ID":"f1a01622-237d-4ea0-9168-c8db48fb1478","Type":"ContainerStarted","Data":"98394b474fb915a92c2b531a1ee4476f483e28dfeb065d7d11f881fcbbf966f2"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.820912 4748 generic.go:334] "Generic (PLEG): container finished" podID="056ff617-ed15-4344-8883-afb1abd38abb" containerID="d00d1cd95d6aeea04a3f52207c886bfedc4b1bf732ae0fcfdc5f08c4c58d28bd" exitCode=0
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.821122 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mzn7b" event={"ID":"056ff617-ed15-4344-8883-afb1abd38abb","Type":"ContainerDied","Data":"d00d1cd95d6aeea04a3f52207c886bfedc4b1bf732ae0fcfdc5f08c4c58d28bd"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.821155 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mzn7b" event={"ID":"056ff617-ed15-4344-8883-afb1abd38abb","Type":"ContainerStarted","Data":"5899a73ba00c6580a1004b3438e2ae715f5539b31d4490cf0f6de1d643e8af4a"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.822854 4748 generic.go:334] "Generic (PLEG): container finished" podID="8949f08a-2d27-4055-86a7-a66a77de530b" containerID="d7bc10c1a8c9d9d3e0ee576dd9aa00a993b281bf5342e91016f27cc02bba9b86" exitCode=0
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.822931 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n8tkm" event={"ID":"8949f08a-2d27-4055-86a7-a66a77de530b","Type":"ContainerDied","Data":"d7bc10c1a8c9d9d3e0ee576dd9aa00a993b281bf5342e91016f27cc02bba9b86"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.822968 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n8tkm" event={"ID":"8949f08a-2d27-4055-86a7-a66a77de530b","Type":"ContainerStarted","Data":"41a64a3454839cb9e62c1df63d0cfa4bbbc8977f721d4e38e7930ba6b1c3c664"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.824784 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7a59-account-create-update-vt42l" event={"ID":"5905f14e-ca31-4983-9335-db320b71f0d1","Type":"ContainerStarted","Data":"2ccdc100ca6c5f92b72e272d08534275a25bf6a2c5c36331713c59dd6c4c33da"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.824810 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7a59-account-create-update-vt42l" event={"ID":"5905f14e-ca31-4983-9335-db320b71f0d1","Type":"ContainerStarted","Data":"cfd384e0d3367ff6eb80b91a6ead8d9659d970ae1e7b3023112f4ba033984a69"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.826991 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n" event={"ID":"954ea93a-7914-4792-b989-9a376beed991","Type":"ContainerDied","Data":"d3c55c5370ebc6b98b221595a9803fe09255531485f0e997eb83aa38a0f29051"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.827099 4748 scope.go:117] "RemoveContainer" containerID="cf097fcea63752268a3f9ba13c17109cf4b8f7688f589534286bade0237d59d1"
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.827019 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8hm6n"
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.828691 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-915e-account-create-update-rq9s8" event={"ID":"7b34c916-eb28-4060-8a0c-22b7ae45bcaa","Type":"ContainerStarted","Data":"e3f5d3bfe950565c6a7ea780b13c5fa011c1630f5c5b9ae8eaefa509cb2dd41c"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.828721 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q2gxz"
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.828735 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-915e-account-create-update-rq9s8" event={"ID":"7b34c916-eb28-4060-8a0c-22b7ae45bcaa","Type":"ContainerStarted","Data":"f01426a24ee59cd6ab9785ca0a3ae7f67102d9147c90d6fef54d284fd530ed65"}
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.839468 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mlztp" podStartSLOduration=1.916471727 podStartE2EDuration="7.839444727s" podCreationTimestamp="2026-03-20 10:55:20 +0000 UTC" firstStartedPulling="2026-03-20 10:55:20.83310017 +0000 UTC m=+1155.974645984" lastFinishedPulling="2026-03-20 10:55:26.75607317 +0000 UTC m=+1161.897618984" observedRunningTime="2026-03-20 10:55:27.83876522 +0000 UTC m=+1162.980311034" watchObservedRunningTime="2026-03-20 10:55:27.839444727 +0000 UTC m=+1162.980990541"
Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.860187 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openstack/keystone-915e-account-create-update-rq9s8" podStartSLOduration=4.860157508 podStartE2EDuration="4.860157508s" podCreationTimestamp="2026-03-20 10:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:27.858712272 +0000 UTC m=+1163.000258136" watchObservedRunningTime="2026-03-20 10:55:27.860157508 +0000 UTC m=+1163.001703322" Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.888450 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-7a59-account-create-update-vt42l" podStartSLOduration=5.8884215090000005 podStartE2EDuration="5.888421509s" podCreationTimestamp="2026-03-20 10:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:27.879860823 +0000 UTC m=+1163.021406657" watchObservedRunningTime="2026-03-20 10:55:27.888421509 +0000 UTC m=+1163.029967323" Mar 20 10:55:27 crc kubenswrapper[4748]: I0320 10:55:27.990123 4748 scope.go:117] "RemoveContainer" containerID="99de9558a1c154da176d583f15404d0f01c5c93bd30f19853d9beac956ffd70d" Mar 20 10:55:28 crc kubenswrapper[4748]: I0320 10:55:28.021707 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8hm6n"] Mar 20 10:55:28 crc kubenswrapper[4748]: I0320 10:55:28.029130 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8hm6n"] Mar 20 10:55:28 crc kubenswrapper[4748]: I0320 10:55:28.771646 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-bldp9" Mar 20 10:55:28 crc kubenswrapper[4748]: I0320 10:55:28.861229 4748 generic.go:334] "Generic (PLEG): container finished" podID="5905f14e-ca31-4983-9335-db320b71f0d1" containerID="2ccdc100ca6c5f92b72e272d08534275a25bf6a2c5c36331713c59dd6c4c33da" exitCode=0 Mar 20 10:55:28 crc 
kubenswrapper[4748]: I0320 10:55:28.861326 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7a59-account-create-update-vt42l" event={"ID":"5905f14e-ca31-4983-9335-db320b71f0d1","Type":"ContainerDied","Data":"2ccdc100ca6c5f92b72e272d08534275a25bf6a2c5c36331713c59dd6c4c33da"} Mar 20 10:55:28 crc kubenswrapper[4748]: I0320 10:55:28.889659 4748 generic.go:334] "Generic (PLEG): container finished" podID="7b34c916-eb28-4060-8a0c-22b7ae45bcaa" containerID="e3f5d3bfe950565c6a7ea780b13c5fa011c1630f5c5b9ae8eaefa509cb2dd41c" exitCode=0 Mar 20 10:55:28 crc kubenswrapper[4748]: I0320 10:55:28.889777 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-915e-account-create-update-rq9s8" event={"ID":"7b34c916-eb28-4060-8a0c-22b7ae45bcaa","Type":"ContainerDied","Data":"e3f5d3bfe950565c6a7ea780b13c5fa011c1630f5c5b9ae8eaefa509cb2dd41c"} Mar 20 10:55:28 crc kubenswrapper[4748]: I0320 10:55:28.896556 4748 generic.go:334] "Generic (PLEG): container finished" podID="f1a01622-237d-4ea0-9168-c8db48fb1478" containerID="7cb3666c6f49b0aba3c470df100fcae2dc72637ad9c5c723dc545bd54342e5c2" exitCode=0 Mar 20 10:55:28 crc kubenswrapper[4748]: I0320 10:55:28.897314 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bldp9-config-7hfj6" event={"ID":"f1a01622-237d-4ea0-9168-c8db48fb1478","Type":"ContainerDied","Data":"7cb3666c6f49b0aba3c470df100fcae2dc72637ad9c5c723dc545bd54342e5c2"} Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.323780 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n8tkm" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.331270 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mzn7b" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.344094 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wvnt\" (UniqueName: \"kubernetes.io/projected/8949f08a-2d27-4055-86a7-a66a77de530b-kube-api-access-2wvnt\") pod \"8949f08a-2d27-4055-86a7-a66a77de530b\" (UID: \"8949f08a-2d27-4055-86a7-a66a77de530b\") " Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.344198 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8949f08a-2d27-4055-86a7-a66a77de530b-operator-scripts\") pod \"8949f08a-2d27-4055-86a7-a66a77de530b\" (UID: \"8949f08a-2d27-4055-86a7-a66a77de530b\") " Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.344243 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056ff617-ed15-4344-8883-afb1abd38abb-operator-scripts\") pod \"056ff617-ed15-4344-8883-afb1abd38abb\" (UID: \"056ff617-ed15-4344-8883-afb1abd38abb\") " Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.344269 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z62s5\" (UniqueName: \"kubernetes.io/projected/056ff617-ed15-4344-8883-afb1abd38abb-kube-api-access-z62s5\") pod \"056ff617-ed15-4344-8883-afb1abd38abb\" (UID: \"056ff617-ed15-4344-8883-afb1abd38abb\") " Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.345303 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8949f08a-2d27-4055-86a7-a66a77de530b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8949f08a-2d27-4055-86a7-a66a77de530b" (UID: "8949f08a-2d27-4055-86a7-a66a77de530b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.345322 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056ff617-ed15-4344-8883-afb1abd38abb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "056ff617-ed15-4344-8883-afb1abd38abb" (UID: "056ff617-ed15-4344-8883-afb1abd38abb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.354952 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8949f08a-2d27-4055-86a7-a66a77de530b-kube-api-access-2wvnt" (OuterVolumeSpecName: "kube-api-access-2wvnt") pod "8949f08a-2d27-4055-86a7-a66a77de530b" (UID: "8949f08a-2d27-4055-86a7-a66a77de530b"). InnerVolumeSpecName "kube-api-access-2wvnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.363393 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/056ff617-ed15-4344-8883-afb1abd38abb-kube-api-access-z62s5" (OuterVolumeSpecName: "kube-api-access-z62s5") pod "056ff617-ed15-4344-8883-afb1abd38abb" (UID: "056ff617-ed15-4344-8883-afb1abd38abb"). InnerVolumeSpecName "kube-api-access-z62s5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.446389 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8949f08a-2d27-4055-86a7-a66a77de530b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.446441 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056ff617-ed15-4344-8883-afb1abd38abb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.446456 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z62s5\" (UniqueName: \"kubernetes.io/projected/056ff617-ed15-4344-8883-afb1abd38abb-kube-api-access-z62s5\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.446469 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wvnt\" (UniqueName: \"kubernetes.io/projected/8949f08a-2d27-4055-86a7-a66a77de530b-kube-api-access-2wvnt\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.530147 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="954ea93a-7914-4792-b989-9a376beed991" path="/var/lib/kubelet/pods/954ea93a-7914-4792-b989-9a376beed991/volumes" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.906971 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mzn7b" event={"ID":"056ff617-ed15-4344-8883-afb1abd38abb","Type":"ContainerDied","Data":"5899a73ba00c6580a1004b3438e2ae715f5539b31d4490cf0f6de1d643e8af4a"} Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.907017 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5899a73ba00c6580a1004b3438e2ae715f5539b31d4490cf0f6de1d643e8af4a" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 
10:55:29.907015 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mzn7b" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.909014 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n8tkm" event={"ID":"8949f08a-2d27-4055-86a7-a66a77de530b","Type":"ContainerDied","Data":"41a64a3454839cb9e62c1df63d0cfa4bbbc8977f721d4e38e7930ba6b1c3c664"} Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.909038 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41a64a3454839cb9e62c1df63d0cfa4bbbc8977f721d4e38e7930ba6b1c3c664" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.909058 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n8tkm" Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.911173 4748 generic.go:334] "Generic (PLEG): container finished" podID="c9362889-0195-4aad-96bd-ed63db88da83" containerID="6ccf567ca597230b5f8e105670b07f93dbf0776dfe28b4aadbed6ba96ca21f1b" exitCode=0 Mar 20 10:55:29 crc kubenswrapper[4748]: I0320 10:55:29.911304 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9362889-0195-4aad-96bd-ed63db88da83","Type":"ContainerDied","Data":"6ccf567ca597230b5f8e105670b07f93dbf0776dfe28b4aadbed6ba96ca21f1b"} Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.335044 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bldp9-config-7hfj6" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.372515 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-additional-scripts\") pod \"f1a01622-237d-4ea0-9168-c8db48fb1478\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.372637 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run-ovn\") pod \"f1a01622-237d-4ea0-9168-c8db48fb1478\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.372665 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx22s\" (UniqueName: \"kubernetes.io/projected/f1a01622-237d-4ea0-9168-c8db48fb1478-kube-api-access-kx22s\") pod \"f1a01622-237d-4ea0-9168-c8db48fb1478\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.372718 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-log-ovn\") pod \"f1a01622-237d-4ea0-9168-c8db48fb1478\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.372754 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-scripts\") pod \"f1a01622-237d-4ea0-9168-c8db48fb1478\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.372858 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run\") pod \"f1a01622-237d-4ea0-9168-c8db48fb1478\" (UID: \"f1a01622-237d-4ea0-9168-c8db48fb1478\") " Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.375192 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run" (OuterVolumeSpecName: "var-run") pod "f1a01622-237d-4ea0-9168-c8db48fb1478" (UID: "f1a01622-237d-4ea0-9168-c8db48fb1478"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.375245 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f1a01622-237d-4ea0-9168-c8db48fb1478" (UID: "f1a01622-237d-4ea0-9168-c8db48fb1478"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.376417 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-scripts" (OuterVolumeSpecName: "scripts") pod "f1a01622-237d-4ea0-9168-c8db48fb1478" (UID: "f1a01622-237d-4ea0-9168-c8db48fb1478"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.376464 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f1a01622-237d-4ea0-9168-c8db48fb1478" (UID: "f1a01622-237d-4ea0-9168-c8db48fb1478"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.376745 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f1a01622-237d-4ea0-9168-c8db48fb1478" (UID: "f1a01622-237d-4ea0-9168-c8db48fb1478"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.385872 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a01622-237d-4ea0-9168-c8db48fb1478-kube-api-access-kx22s" (OuterVolumeSpecName: "kube-api-access-kx22s") pod "f1a01622-237d-4ea0-9168-c8db48fb1478" (UID: "f1a01622-237d-4ea0-9168-c8db48fb1478"). InnerVolumeSpecName "kube-api-access-kx22s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.469155 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7a59-account-create-update-vt42l" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.476701 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-915e-account-create-update-rq9s8" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.476936 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29b85\" (UniqueName: \"kubernetes.io/projected/5905f14e-ca31-4983-9335-db320b71f0d1-kube-api-access-29b85\") pod \"5905f14e-ca31-4983-9335-db320b71f0d1\" (UID: \"5905f14e-ca31-4983-9335-db320b71f0d1\") " Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.477163 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5905f14e-ca31-4983-9335-db320b71f0d1-operator-scripts\") pod \"5905f14e-ca31-4983-9335-db320b71f0d1\" (UID: \"5905f14e-ca31-4983-9335-db320b71f0d1\") " Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.477739 4748 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.477764 4748 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.477794 4748 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.477805 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx22s\" (UniqueName: \"kubernetes.io/projected/f1a01622-237d-4ea0-9168-c8db48fb1478-kube-api-access-kx22s\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.477814 4748 reconciler_common.go:293] "Volume detached for 
volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a01622-237d-4ea0-9168-c8db48fb1478-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.477823 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1a01622-237d-4ea0-9168-c8db48fb1478-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.477824 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5905f14e-ca31-4983-9335-db320b71f0d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5905f14e-ca31-4983-9335-db320b71f0d1" (UID: "5905f14e-ca31-4983-9335-db320b71f0d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.480756 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5905f14e-ca31-4983-9335-db320b71f0d1-kube-api-access-29b85" (OuterVolumeSpecName: "kube-api-access-29b85") pod "5905f14e-ca31-4983-9335-db320b71f0d1" (UID: "5905f14e-ca31-4983-9335-db320b71f0d1"). InnerVolumeSpecName "kube-api-access-29b85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.486963 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-q2gxz"] Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.493747 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-q2gxz"] Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.578718 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-operator-scripts\") pod \"7b34c916-eb28-4060-8a0c-22b7ae45bcaa\" (UID: \"7b34c916-eb28-4060-8a0c-22b7ae45bcaa\") " Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.578906 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddxhn\" (UniqueName: \"kubernetes.io/projected/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-kube-api-access-ddxhn\") pod \"7b34c916-eb28-4060-8a0c-22b7ae45bcaa\" (UID: \"7b34c916-eb28-4060-8a0c-22b7ae45bcaa\") " Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.579240 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5905f14e-ca31-4983-9335-db320b71f0d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.579292 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29b85\" (UniqueName: \"kubernetes.io/projected/5905f14e-ca31-4983-9335-db320b71f0d1-kube-api-access-29b85\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.579459 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b34c916-eb28-4060-8a0c-22b7ae45bcaa" (UID: 
"7b34c916-eb28-4060-8a0c-22b7ae45bcaa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.582225 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-kube-api-access-ddxhn" (OuterVolumeSpecName: "kube-api-access-ddxhn") pod "7b34c916-eb28-4060-8a0c-22b7ae45bcaa" (UID: "7b34c916-eb28-4060-8a0c-22b7ae45bcaa"). InnerVolumeSpecName "kube-api-access-ddxhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.681082 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.681117 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddxhn\" (UniqueName: \"kubernetes.io/projected/7b34c916-eb28-4060-8a0c-22b7ae45bcaa-kube-api-access-ddxhn\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.921414 4748 generic.go:334] "Generic (PLEG): container finished" podID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" containerID="6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774" exitCode=0 Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.921536 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7","Type":"ContainerDied","Data":"6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774"} Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.923224 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-915e-account-create-update-rq9s8" 
event={"ID":"7b34c916-eb28-4060-8a0c-22b7ae45bcaa","Type":"ContainerDied","Data":"f01426a24ee59cd6ab9785ca0a3ae7f67102d9147c90d6fef54d284fd530ed65"} Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.923660 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f01426a24ee59cd6ab9785ca0a3ae7f67102d9147c90d6fef54d284fd530ed65" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.923250 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-915e-account-create-update-rq9s8" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.925573 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9362889-0195-4aad-96bd-ed63db88da83","Type":"ContainerStarted","Data":"80f1fafc8886612f8d64fc6572f4a7fedd2096e9035a5f9faa8fb4a3ddc04814"} Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.925784 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.927559 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bldp9-config-7hfj6" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.927984 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bldp9-config-7hfj6" event={"ID":"f1a01622-237d-4ea0-9168-c8db48fb1478","Type":"ContainerDied","Data":"98394b474fb915a92c2b531a1ee4476f483e28dfeb065d7d11f881fcbbf966f2"} Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.928423 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98394b474fb915a92c2b531a1ee4476f483e28dfeb065d7d11f881fcbbf966f2" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.929400 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7a59-account-create-update-vt42l" event={"ID":"5905f14e-ca31-4983-9335-db320b71f0d1","Type":"ContainerDied","Data":"cfd384e0d3367ff6eb80b91a6ead8d9659d970ae1e7b3023112f4ba033984a69"} Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.929468 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfd384e0d3367ff6eb80b91a6ead8d9659d970ae1e7b3023112f4ba033984a69" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.929431 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7a59-account-create-update-vt42l" Mar 20 10:55:30 crc kubenswrapper[4748]: I0320 10:55:30.982623 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.628641086 podStartE2EDuration="1m22.98260275s" podCreationTimestamp="2026-03-20 10:54:08 +0000 UTC" firstStartedPulling="2026-03-20 10:54:10.728946598 +0000 UTC m=+1085.870492412" lastFinishedPulling="2026-03-20 10:54:56.082908262 +0000 UTC m=+1131.224454076" observedRunningTime="2026-03-20 10:55:30.976918437 +0000 UTC m=+1166.118464261" watchObservedRunningTime="2026-03-20 10:55:30.98260275 +0000 UTC m=+1166.124148564" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.440263 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bldp9-config-7hfj6"] Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.448563 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bldp9-config-7hfj6"] Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.527405 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4029da34-a74b-481e-a7b8-692ff54e61f9" path="/var/lib/kubelet/pods/4029da34-a74b-481e-a7b8-692ff54e61f9/volumes" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.528047 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a01622-237d-4ea0-9168-c8db48fb1478" path="/var/lib/kubelet/pods/f1a01622-237d-4ea0-9168-c8db48fb1478/volumes" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.579199 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bldp9-config-w9trp"] Mar 20 10:55:31 crc kubenswrapper[4748]: E0320 10:55:31.579609 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="056ff617-ed15-4344-8883-afb1abd38abb" containerName="mariadb-database-create" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.579635 4748 
state_mem.go:107] "Deleted CPUSet assignment" podUID="056ff617-ed15-4344-8883-afb1abd38abb" containerName="mariadb-database-create" Mar 20 10:55:31 crc kubenswrapper[4748]: E0320 10:55:31.579656 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8949f08a-2d27-4055-86a7-a66a77de530b" containerName="mariadb-database-create" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.579665 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8949f08a-2d27-4055-86a7-a66a77de530b" containerName="mariadb-database-create" Mar 20 10:55:31 crc kubenswrapper[4748]: E0320 10:55:31.579681 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="954ea93a-7914-4792-b989-9a376beed991" containerName="dnsmasq-dns" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.579690 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="954ea93a-7914-4792-b989-9a376beed991" containerName="dnsmasq-dns" Mar 20 10:55:31 crc kubenswrapper[4748]: E0320 10:55:31.579704 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b34c916-eb28-4060-8a0c-22b7ae45bcaa" containerName="mariadb-account-create-update" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.579711 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b34c916-eb28-4060-8a0c-22b7ae45bcaa" containerName="mariadb-account-create-update" Mar 20 10:55:31 crc kubenswrapper[4748]: E0320 10:55:31.579732 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4029da34-a74b-481e-a7b8-692ff54e61f9" containerName="mariadb-account-create-update" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.579739 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="4029da34-a74b-481e-a7b8-692ff54e61f9" containerName="mariadb-account-create-update" Mar 20 10:55:31 crc kubenswrapper[4748]: E0320 10:55:31.579756 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5905f14e-ca31-4983-9335-db320b71f0d1" 
containerName="mariadb-account-create-update" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.579765 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="5905f14e-ca31-4983-9335-db320b71f0d1" containerName="mariadb-account-create-update" Mar 20 10:55:31 crc kubenswrapper[4748]: E0320 10:55:31.579776 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a01622-237d-4ea0-9168-c8db48fb1478" containerName="ovn-config" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.579784 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a01622-237d-4ea0-9168-c8db48fb1478" containerName="ovn-config" Mar 20 10:55:31 crc kubenswrapper[4748]: E0320 10:55:31.579796 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="954ea93a-7914-4792-b989-9a376beed991" containerName="init" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.579806 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="954ea93a-7914-4792-b989-9a376beed991" containerName="init" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.580013 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="5905f14e-ca31-4983-9335-db320b71f0d1" containerName="mariadb-account-create-update" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.580033 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a01622-237d-4ea0-9168-c8db48fb1478" containerName="ovn-config" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.580048 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b34c916-eb28-4060-8a0c-22b7ae45bcaa" containerName="mariadb-account-create-update" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.580061 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="954ea93a-7914-4792-b989-9a376beed991" containerName="dnsmasq-dns" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.580072 4748 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8949f08a-2d27-4055-86a7-a66a77de530b" containerName="mariadb-database-create" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.580088 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="056ff617-ed15-4344-8883-afb1abd38abb" containerName="mariadb-database-create" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.580101 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="4029da34-a74b-481e-a7b8-692ff54e61f9" containerName="mariadb-account-create-update" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.580739 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.584916 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.602935 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bldp9-config-w9trp"] Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.699105 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2qpf\" (UniqueName: \"kubernetes.io/projected/fabd0c00-596f-4785-ae16-be41a0bd6ee2-kube-api-access-j2qpf\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.699173 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.699322 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-additional-scripts\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.699449 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-scripts\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.700197 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-log-ovn\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.700255 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run-ovn\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.801483 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-scripts\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 
10:55:31.801883 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-log-ovn\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.801920 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run-ovn\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.801976 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2qpf\" (UniqueName: \"kubernetes.io/projected/fabd0c00-596f-4785-ae16-be41a0bd6ee2-kube-api-access-j2qpf\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.802015 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.802119 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-additional-scripts\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 
10:55:31.802356 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.802402 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run-ovn\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.802467 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-log-ovn\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.802717 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-additional-scripts\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.803672 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-scripts\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.823530 4748 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j2qpf\" (UniqueName: \"kubernetes.io/projected/fabd0c00-596f-4785-ae16-be41a0bd6ee2-kube-api-access-j2qpf\") pod \"ovn-controller-bldp9-config-w9trp\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.901281 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.940696 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7","Type":"ContainerStarted","Data":"3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9"} Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.941732 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:55:31 crc kubenswrapper[4748]: I0320 10:55:31.977231 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.928048684 podStartE2EDuration="1m24.977209535s" podCreationTimestamp="2026-03-20 10:54:07 +0000 UTC" firstStartedPulling="2026-03-20 10:54:10.086170389 +0000 UTC m=+1085.227716213" lastFinishedPulling="2026-03-20 10:54:56.13533125 +0000 UTC m=+1131.276877064" observedRunningTime="2026-03-20 10:55:31.975324227 +0000 UTC m=+1167.116870041" watchObservedRunningTime="2026-03-20 10:55:31.977209535 +0000 UTC m=+1167.118755359" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.005059 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:32 crc kubenswrapper[4748]: E0320 
10:55:32.005329 4748 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 10:55:32 crc kubenswrapper[4748]: E0320 10:55:32.005350 4748 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 10:55:32 crc kubenswrapper[4748]: E0320 10:55:32.005419 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift podName:7753c601-7739-4165-b5f2-a673b0797334 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:48.005399413 +0000 UTC m=+1183.146945227 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift") pod "swift-storage-0" (UID: "7753c601-7739-4165-b5f2-a673b0797334") : configmap "swift-ring-files" not found Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.457446 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bldp9-config-w9trp"] Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.641418 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-t76rv"] Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.642908 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.645230 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.646638 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vdzj8" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.680010 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-t76rv"] Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.720967 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-combined-ca-bundle\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.721457 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbdvq\" (UniqueName: \"kubernetes.io/projected/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-kube-api-access-mbdvq\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.721815 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-config-data\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.722105 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-db-sync-config-data\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.823353 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-db-sync-config-data\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.823485 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-combined-ca-bundle\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.823509 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbdvq\" (UniqueName: \"kubernetes.io/projected/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-kube-api-access-mbdvq\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.823544 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-config-data\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.831182 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-db-sync-config-data\") pod \"glance-db-sync-t76rv\" (UID: 
\"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.831352 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-config-data\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.839882 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-combined-ca-bundle\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.842811 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbdvq\" (UniqueName: \"kubernetes.io/projected/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-kube-api-access-mbdvq\") pod \"glance-db-sync-t76rv\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.951409 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bldp9-config-w9trp" event={"ID":"fabd0c00-596f-4785-ae16-be41a0bd6ee2","Type":"ContainerStarted","Data":"e0195e5b9b083710c317244426866370155836b4ed13cd67af2e6cb0295ece7b"} Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.953033 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bldp9-config-w9trp" event={"ID":"fabd0c00-596f-4785-ae16-be41a0bd6ee2","Type":"ContainerStarted","Data":"2221a1bc56169d854eba3612703a047142a66f519011bbebc1531eb32bf2a3c6"} Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.976057 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-bldp9-config-w9trp" podStartSLOduration=1.9760276060000002 podStartE2EDuration="1.976027606s" podCreationTimestamp="2026-03-20 10:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:32.972639401 +0000 UTC m=+1168.114185215" watchObservedRunningTime="2026-03-20 10:55:32.976027606 +0000 UTC m=+1168.117573420" Mar 20 10:55:32 crc kubenswrapper[4748]: I0320 10:55:32.979690 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-t76rv" Mar 20 10:55:33 crc kubenswrapper[4748]: I0320 10:55:33.619396 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-t76rv"] Mar 20 10:55:33 crc kubenswrapper[4748]: W0320 10:55:33.622486 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfabafe6_f3da_44d2_bb13_cc39bffc6dbd.slice/crio-9fe3147ab0152b5b9c83f5d3c060b48df0756a6ba68993d41cbb466ea2dfb079 WatchSource:0}: Error finding container 9fe3147ab0152b5b9c83f5d3c060b48df0756a6ba68993d41cbb466ea2dfb079: Status 404 returned error can't find the container with id 9fe3147ab0152b5b9c83f5d3c060b48df0756a6ba68993d41cbb466ea2dfb079 Mar 20 10:55:33 crc kubenswrapper[4748]: I0320 10:55:33.960931 4748 generic.go:334] "Generic (PLEG): container finished" podID="fabd0c00-596f-4785-ae16-be41a0bd6ee2" containerID="e0195e5b9b083710c317244426866370155836b4ed13cd67af2e6cb0295ece7b" exitCode=0 Mar 20 10:55:33 crc kubenswrapper[4748]: I0320 10:55:33.962111 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bldp9-config-w9trp" event={"ID":"fabd0c00-596f-4785-ae16-be41a0bd6ee2","Type":"ContainerDied","Data":"e0195e5b9b083710c317244426866370155836b4ed13cd67af2e6cb0295ece7b"} Mar 20 10:55:33 crc kubenswrapper[4748]: I0320 10:55:33.965123 4748 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-db-sync-t76rv" event={"ID":"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd","Type":"ContainerStarted","Data":"9fe3147ab0152b5b9c83f5d3c060b48df0756a6ba68993d41cbb466ea2dfb079"} Mar 20 10:55:34 crc kubenswrapper[4748]: I0320 10:55:34.976764 4748 generic.go:334] "Generic (PLEG): container finished" podID="8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" containerID="863987fe20374b36fdc5f4e6a6d860e55e30e7e7660c1768c5557c8667f9fc13" exitCode=0 Mar 20 10:55:34 crc kubenswrapper[4748]: I0320 10:55:34.977046 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mlztp" event={"ID":"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17","Type":"ContainerDied","Data":"863987fe20374b36fdc5f4e6a6d860e55e30e7e7660c1768c5557c8667f9fc13"} Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.341175 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.468247 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-additional-scripts\") pod \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.468371 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run-ovn\") pod \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.468428 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-scripts\") pod \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\" (UID: 
\"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.468499 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2qpf\" (UniqueName: \"kubernetes.io/projected/fabd0c00-596f-4785-ae16-be41a0bd6ee2-kube-api-access-j2qpf\") pod \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.468537 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-log-ovn\") pod \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.468614 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run\") pod \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\" (UID: \"fabd0c00-596f-4785-ae16-be41a0bd6ee2\") " Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.469185 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run" (OuterVolumeSpecName: "var-run") pod "fabd0c00-596f-4785-ae16-be41a0bd6ee2" (UID: "fabd0c00-596f-4785-ae16-be41a0bd6ee2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.469983 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "fabd0c00-596f-4785-ae16-be41a0bd6ee2" (UID: "fabd0c00-596f-4785-ae16-be41a0bd6ee2"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.470171 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fabd0c00-596f-4785-ae16-be41a0bd6ee2" (UID: "fabd0c00-596f-4785-ae16-be41a0bd6ee2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.470260 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fabd0c00-596f-4785-ae16-be41a0bd6ee2" (UID: "fabd0c00-596f-4785-ae16-be41a0bd6ee2"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.470959 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-scripts" (OuterVolumeSpecName: "scripts") pod "fabd0c00-596f-4785-ae16-be41a0bd6ee2" (UID: "fabd0c00-596f-4785-ae16-be41a0bd6ee2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.475655 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fabd0c00-596f-4785-ae16-be41a0bd6ee2-kube-api-access-j2qpf" (OuterVolumeSpecName: "kube-api-access-j2qpf") pod "fabd0c00-596f-4785-ae16-be41a0bd6ee2" (UID: "fabd0c00-596f-4785-ae16-be41a0bd6ee2"). InnerVolumeSpecName "kube-api-access-j2qpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.510305 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gk4gw"] Mar 20 10:55:35 crc kubenswrapper[4748]: E0320 10:55:35.510696 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabd0c00-596f-4785-ae16-be41a0bd6ee2" containerName="ovn-config" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.510721 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabd0c00-596f-4785-ae16-be41a0bd6ee2" containerName="ovn-config" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.510916 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="fabd0c00-596f-4785-ae16-be41a0bd6ee2" containerName="ovn-config" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.511511 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gk4gw" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.522336 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.540335 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gk4gw"] Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.571305 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkcbs\" (UniqueName: \"kubernetes.io/projected/07aca114-65eb-4bd6-8d63-d17635771c2d-kube-api-access-qkcbs\") pod \"root-account-create-update-gk4gw\" (UID: \"07aca114-65eb-4bd6-8d63-d17635771c2d\") " pod="openstack/root-account-create-update-gk4gw" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.571529 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/07aca114-65eb-4bd6-8d63-d17635771c2d-operator-scripts\") pod \"root-account-create-update-gk4gw\" (UID: \"07aca114-65eb-4bd6-8d63-d17635771c2d\") " pod="openstack/root-account-create-update-gk4gw" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.571582 4748 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.571598 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.571610 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2qpf\" (UniqueName: \"kubernetes.io/projected/fabd0c00-596f-4785-ae16-be41a0bd6ee2-kube-api-access-j2qpf\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.571621 4748 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.571631 4748 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fabd0c00-596f-4785-ae16-be41a0bd6ee2-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.571642 4748 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fabd0c00-596f-4785-ae16-be41a0bd6ee2-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.597567 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bldp9-config-w9trp"] Mar 20 10:55:35 
crc kubenswrapper[4748]: I0320 10:55:35.621019 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bldp9-config-w9trp"] Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.673330 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07aca114-65eb-4bd6-8d63-d17635771c2d-operator-scripts\") pod \"root-account-create-update-gk4gw\" (UID: \"07aca114-65eb-4bd6-8d63-d17635771c2d\") " pod="openstack/root-account-create-update-gk4gw" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.673433 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkcbs\" (UniqueName: \"kubernetes.io/projected/07aca114-65eb-4bd6-8d63-d17635771c2d-kube-api-access-qkcbs\") pod \"root-account-create-update-gk4gw\" (UID: \"07aca114-65eb-4bd6-8d63-d17635771c2d\") " pod="openstack/root-account-create-update-gk4gw" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.674749 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07aca114-65eb-4bd6-8d63-d17635771c2d-operator-scripts\") pod \"root-account-create-update-gk4gw\" (UID: \"07aca114-65eb-4bd6-8d63-d17635771c2d\") " pod="openstack/root-account-create-update-gk4gw" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.711655 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkcbs\" (UniqueName: \"kubernetes.io/projected/07aca114-65eb-4bd6-8d63-d17635771c2d-kube-api-access-qkcbs\") pod \"root-account-create-update-gk4gw\" (UID: \"07aca114-65eb-4bd6-8d63-d17635771c2d\") " pod="openstack/root-account-create-update-gk4gw" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.834204 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gk4gw" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.991344 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bldp9-config-w9trp" Mar 20 10:55:35 crc kubenswrapper[4748]: I0320 10:55:35.991519 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2221a1bc56169d854eba3612703a047142a66f519011bbebc1531eb32bf2a3c6" Mar 20 10:55:36 crc kubenswrapper[4748]: I0320 10:55:36.307254 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gk4gw"] Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.006012 4748 generic.go:334] "Generic (PLEG): container finished" podID="07aca114-65eb-4bd6-8d63-d17635771c2d" containerID="16edf332d6a43bd6e5cbc1f5a8b01321d8d02151271b55dc3b7b4f2faef92351" exitCode=0 Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.006139 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gk4gw" event={"ID":"07aca114-65eb-4bd6-8d63-d17635771c2d","Type":"ContainerDied","Data":"16edf332d6a43bd6e5cbc1f5a8b01321d8d02151271b55dc3b7b4f2faef92351"} Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.006386 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gk4gw" event={"ID":"07aca114-65eb-4bd6-8d63-d17635771c2d","Type":"ContainerStarted","Data":"72525d7282ba6163020c74660e10a20535646951648be12855cfc7b534033bfb"} Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.214148 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.306605 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-etc-swift\") pod \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.306853 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-dispersionconf\") pod \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.306913 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z5bv\" (UniqueName: \"kubernetes.io/projected/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-kube-api-access-7z5bv\") pod \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.307021 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-swiftconf\") pod \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.307056 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-ring-data-devices\") pod \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.307083 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-combined-ca-bundle\") pod \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.307128 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-scripts\") pod \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.308048 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" (UID: "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.308108 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" (UID: "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.326585 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-kube-api-access-7z5bv" (OuterVolumeSpecName: "kube-api-access-7z5bv") pod "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" (UID: "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17"). InnerVolumeSpecName "kube-api-access-7z5bv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:37 crc kubenswrapper[4748]: E0320 10:55:37.337161 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-combined-ca-bundle podName:8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:37.837111189 +0000 UTC m=+1172.978657003 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-combined-ca-bundle") pod "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" (UID: "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17") : error deleting /var/lib/kubelet/pods/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17/volume-subpaths: remove /var/lib/kubelet/pods/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17/volume-subpaths: no such file or directory Mar 20 10:55:37 crc kubenswrapper[4748]: E0320 10:55:37.337378 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-scripts podName:8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:37.837355175 +0000 UTC m=+1172.978900989 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "scripts" (UniqueName: "kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-scripts") pod "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" (UID: "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17") : error deleting /var/lib/kubelet/pods/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17/volume-subpaths: remove /var/lib/kubelet/pods/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17/volume-subpaths: no such file or directory Mar 20 10:55:37 crc kubenswrapper[4748]: E0320 10:55:37.337487 4748 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-swiftconf podName:8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:55:37.837474508 +0000 UTC m=+1172.979020312 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "swiftconf" (UniqueName: "kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-swiftconf") pod "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" (UID: "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17") : error deleting /var/lib/kubelet/pods/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17/volume-subpaths: remove /var/lib/kubelet/pods/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17/volume-subpaths: no such file or directory Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.340614 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" (UID: "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.409294 4748 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.409336 4748 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.409349 4748 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.409358 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z5bv\" (UniqueName: 
\"kubernetes.io/projected/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-kube-api-access-7z5bv\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.524184 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fabd0c00-596f-4785-ae16-be41a0bd6ee2" path="/var/lib/kubelet/pods/fabd0c00-596f-4785-ae16-be41a0bd6ee2/volumes" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.919759 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-swiftconf\") pod \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.920162 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-combined-ca-bundle\") pod \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.920229 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-scripts\") pod \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\" (UID: \"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17\") " Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.921239 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-scripts" (OuterVolumeSpecName: "scripts") pod "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" (UID: "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.926426 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" (UID: "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:55:37 crc kubenswrapper[4748]: I0320 10:55:37.927063 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" (UID: "8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.016104 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mlztp" event={"ID":"8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17","Type":"ContainerDied","Data":"d19766b63e59dee1d42eb0af65bd1b1d8a8ee2aaef6b74254a36804a339ca987"} Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.016205 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d19766b63e59dee1d42eb0af65bd1b1d8a8ee2aaef6b74254a36804a339ca987" Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.016120 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mlztp" Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.022280 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.022310 4748 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.022320 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.273362 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gk4gw" Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.329239 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkcbs\" (UniqueName: \"kubernetes.io/projected/07aca114-65eb-4bd6-8d63-d17635771c2d-kube-api-access-qkcbs\") pod \"07aca114-65eb-4bd6-8d63-d17635771c2d\" (UID: \"07aca114-65eb-4bd6-8d63-d17635771c2d\") " Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.329369 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07aca114-65eb-4bd6-8d63-d17635771c2d-operator-scripts\") pod \"07aca114-65eb-4bd6-8d63-d17635771c2d\" (UID: \"07aca114-65eb-4bd6-8d63-d17635771c2d\") " Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.330605 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aca114-65eb-4bd6-8d63-d17635771c2d-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "07aca114-65eb-4bd6-8d63-d17635771c2d" (UID: "07aca114-65eb-4bd6-8d63-d17635771c2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.335610 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07aca114-65eb-4bd6-8d63-d17635771c2d-kube-api-access-qkcbs" (OuterVolumeSpecName: "kube-api-access-qkcbs") pod "07aca114-65eb-4bd6-8d63-d17635771c2d" (UID: "07aca114-65eb-4bd6-8d63-d17635771c2d"). InnerVolumeSpecName "kube-api-access-qkcbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.431608 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkcbs\" (UniqueName: \"kubernetes.io/projected/07aca114-65eb-4bd6-8d63-d17635771c2d-kube-api-access-qkcbs\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:38 crc kubenswrapper[4748]: I0320 10:55:38.431652 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07aca114-65eb-4bd6-8d63-d17635771c2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:55:39 crc kubenswrapper[4748]: I0320 10:55:39.026211 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gk4gw" event={"ID":"07aca114-65eb-4bd6-8d63-d17635771c2d","Type":"ContainerDied","Data":"72525d7282ba6163020c74660e10a20535646951648be12855cfc7b534033bfb"} Mar 20 10:55:39 crc kubenswrapper[4748]: I0320 10:55:39.026498 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72525d7282ba6163020c74660e10a20535646951648be12855cfc7b534033bfb" Mar 20 10:55:39 crc kubenswrapper[4748]: I0320 10:55:39.026581 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gk4gw" Mar 20 10:55:47 crc kubenswrapper[4748]: I0320 10:55:47.101000 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t76rv" event={"ID":"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd","Type":"ContainerStarted","Data":"b8f75909bd9bc56662a6978c2f85bf27abd90e98a29d9195c5fe14dc0b753532"} Mar 20 10:55:47 crc kubenswrapper[4748]: I0320 10:55:47.122656 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-t76rv" podStartSLOduration=2.943023909 podStartE2EDuration="15.122635297s" podCreationTimestamp="2026-03-20 10:55:32 +0000 UTC" firstStartedPulling="2026-03-20 10:55:33.628671335 +0000 UTC m=+1168.770217149" lastFinishedPulling="2026-03-20 10:55:45.808282723 +0000 UTC m=+1180.949828537" observedRunningTime="2026-03-20 10:55:47.120929404 +0000 UTC m=+1182.262475238" watchObservedRunningTime="2026-03-20 10:55:47.122635297 +0000 UTC m=+1182.264181111" Mar 20 10:55:48 crc kubenswrapper[4748]: I0320 10:55:48.018999 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:48 crc kubenswrapper[4748]: I0320 10:55:48.025390 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7753c601-7739-4165-b5f2-a673b0797334-etc-swift\") pod \"swift-storage-0\" (UID: \"7753c601-7739-4165-b5f2-a673b0797334\") " pod="openstack/swift-storage-0" Mar 20 10:55:48 crc kubenswrapper[4748]: I0320 10:55:48.251458 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 10:55:48 crc kubenswrapper[4748]: I0320 10:55:48.825495 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 10:55:48 crc kubenswrapper[4748]: W0320 10:55:48.829525 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7753c601_7739_4165_b5f2_a673b0797334.slice/crio-3cf304fb39398a6abc598ebd89e33c029cf31d5efde1fb05457316f5b4bf26d3 WatchSource:0}: Error finding container 3cf304fb39398a6abc598ebd89e33c029cf31d5efde1fb05457316f5b4bf26d3: Status 404 returned error can't find the container with id 3cf304fb39398a6abc598ebd89e33c029cf31d5efde1fb05457316f5b4bf26d3 Mar 20 10:55:49 crc kubenswrapper[4748]: I0320 10:55:49.116994 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"3cf304fb39398a6abc598ebd89e33c029cf31d5efde1fb05457316f5b4bf26d3"} Mar 20 10:55:49 crc kubenswrapper[4748]: I0320 10:55:49.277115 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 10:55:49 crc kubenswrapper[4748]: I0320 10:55:49.816174 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.190337 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qnhjl"] Mar 20 10:55:51 crc kubenswrapper[4748]: E0320 10:55:51.190704 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07aca114-65eb-4bd6-8d63-d17635771c2d" containerName="mariadb-account-create-update" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.190720 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="07aca114-65eb-4bd6-8d63-d17635771c2d" containerName="mariadb-account-create-update" Mar 20 10:55:51 crc 
kubenswrapper[4748]: E0320 10:55:51.190736 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" containerName="swift-ring-rebalance" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.190741 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" containerName="swift-ring-rebalance" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.190937 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17" containerName="swift-ring-rebalance" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.190950 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="07aca114-65eb-4bd6-8d63-d17635771c2d" containerName="mariadb-account-create-update" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.191475 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qnhjl" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.208957 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qnhjl"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.382815 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjv9q\" (UniqueName: \"kubernetes.io/projected/2ed8081e-cf8f-41b3-829c-a65608af1644-kube-api-access-rjv9q\") pod \"cinder-db-create-qnhjl\" (UID: \"2ed8081e-cf8f-41b3-829c-a65608af1644\") " pod="openstack/cinder-db-create-qnhjl" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.382921 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ed8081e-cf8f-41b3-829c-a65608af1644-operator-scripts\") pod \"cinder-db-create-qnhjl\" (UID: \"2ed8081e-cf8f-41b3-829c-a65608af1644\") " pod="openstack/cinder-db-create-qnhjl" Mar 20 10:55:51 crc kubenswrapper[4748]: 
I0320 10:55:51.411293 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-edd6-account-create-update-v5v7v"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.412286 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-edd6-account-create-update-v5v7v" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.414568 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.440672 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-edd6-account-create-update-v5v7v"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.484949 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjv9q\" (UniqueName: \"kubernetes.io/projected/2ed8081e-cf8f-41b3-829c-a65608af1644-kube-api-access-rjv9q\") pod \"cinder-db-create-qnhjl\" (UID: \"2ed8081e-cf8f-41b3-829c-a65608af1644\") " pod="openstack/cinder-db-create-qnhjl" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.485019 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ed8081e-cf8f-41b3-829c-a65608af1644-operator-scripts\") pod \"cinder-db-create-qnhjl\" (UID: \"2ed8081e-cf8f-41b3-829c-a65608af1644\") " pod="openstack/cinder-db-create-qnhjl" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.485800 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ed8081e-cf8f-41b3-829c-a65608af1644-operator-scripts\") pod \"cinder-db-create-qnhjl\" (UID: \"2ed8081e-cf8f-41b3-829c-a65608af1644\") " pod="openstack/cinder-db-create-qnhjl" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.527498 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjv9q\" (UniqueName: 
\"kubernetes.io/projected/2ed8081e-cf8f-41b3-829c-a65608af1644-kube-api-access-rjv9q\") pod \"cinder-db-create-qnhjl\" (UID: \"2ed8081e-cf8f-41b3-829c-a65608af1644\") " pod="openstack/cinder-db-create-qnhjl" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.529226 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kctrc"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.530927 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.534614 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.538063 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.539196 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n27fv" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.539267 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.551848 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qnhjl" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.552096 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kctrc"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.589819 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-operator-scripts\") pod \"cinder-edd6-account-create-update-v5v7v\" (UID: \"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be\") " pod="openstack/cinder-edd6-account-create-update-v5v7v" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.590090 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqldq\" (UniqueName: \"kubernetes.io/projected/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-kube-api-access-mqldq\") pod \"cinder-edd6-account-create-update-v5v7v\" (UID: \"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be\") " pod="openstack/cinder-edd6-account-create-update-v5v7v" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.636758 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-znvbx"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.637946 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-znvbx" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.652943 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-znvbx"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.694080 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-combined-ca-bundle\") pod \"keystone-db-sync-kctrc\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.694174 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-operator-scripts\") pod \"cinder-edd6-account-create-update-v5v7v\" (UID: \"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be\") " pod="openstack/cinder-edd6-account-create-update-v5v7v" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.694214 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg5hj\" (UniqueName: \"kubernetes.io/projected/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-kube-api-access-tg5hj\") pod \"keystone-db-sync-kctrc\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.694319 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqldq\" (UniqueName: \"kubernetes.io/projected/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-kube-api-access-mqldq\") pod \"cinder-edd6-account-create-update-v5v7v\" (UID: \"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be\") " pod="openstack/cinder-edd6-account-create-update-v5v7v" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.694423 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-config-data\") pod \"keystone-db-sync-kctrc\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.695735 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-operator-scripts\") pod \"cinder-edd6-account-create-update-v5v7v\" (UID: \"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be\") " pod="openstack/cinder-edd6-account-create-update-v5v7v" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.710296 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2hr4g"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.712188 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2hr4g" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.740257 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-beb0-account-create-update-ls746"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.742672 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-beb0-account-create-update-ls746" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.749405 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.749784 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2hr4g"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.754710 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqldq\" (UniqueName: \"kubernetes.io/projected/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-kube-api-access-mqldq\") pod \"cinder-edd6-account-create-update-v5v7v\" (UID: \"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be\") " pod="openstack/cinder-edd6-account-create-update-v5v7v" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.770551 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-beb0-account-create-update-ls746"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.796473 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-config-data\") pod \"keystone-db-sync-kctrc\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.796557 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-operator-scripts\") pod \"barbican-db-create-znvbx\" (UID: \"a2435374-d4a6-4d1d-81ab-ab6bc61ae023\") " pod="openstack/barbican-db-create-znvbx" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.796594 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-combined-ca-bundle\") pod \"keystone-db-sync-kctrc\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.796623 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg5hj\" (UniqueName: \"kubernetes.io/projected/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-kube-api-access-tg5hj\") pod \"keystone-db-sync-kctrc\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.796684 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89rs4\" (UniqueName: \"kubernetes.io/projected/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-kube-api-access-89rs4\") pod \"barbican-db-create-znvbx\" (UID: \"a2435374-d4a6-4d1d-81ab-ab6bc61ae023\") " pod="openstack/barbican-db-create-znvbx" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.806608 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-combined-ca-bundle\") pod \"keystone-db-sync-kctrc\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.817337 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-config-data\") pod \"keystone-db-sync-kctrc\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.827502 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg5hj\" (UniqueName: 
\"kubernetes.io/projected/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-kube-api-access-tg5hj\") pod \"keystone-db-sync-kctrc\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.829216 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f74c-account-create-update-s4q4j"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.839784 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f74c-account-create-update-s4q4j" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.843220 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.844525 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f74c-account-create-update-s4q4j"] Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.900087 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-operator-scripts\") pod \"neutron-db-create-2hr4g\" (UID: \"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f\") " pod="openstack/neutron-db-create-2hr4g" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.900161 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-operator-scripts\") pod \"barbican-db-create-znvbx\" (UID: \"a2435374-d4a6-4d1d-81ab-ab6bc61ae023\") " pod="openstack/barbican-db-create-znvbx" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.900568 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zrvj\" (UniqueName: \"kubernetes.io/projected/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-kube-api-access-9zrvj\") pod 
\"barbican-beb0-account-create-update-ls746\" (UID: \"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f\") " pod="openstack/barbican-beb0-account-create-update-ls746" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.900644 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5fw4\" (UniqueName: \"kubernetes.io/projected/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-kube-api-access-z5fw4\") pod \"neutron-db-create-2hr4g\" (UID: \"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f\") " pod="openstack/neutron-db-create-2hr4g" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.900897 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89rs4\" (UniqueName: \"kubernetes.io/projected/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-kube-api-access-89rs4\") pod \"barbican-db-create-znvbx\" (UID: \"a2435374-d4a6-4d1d-81ab-ab6bc61ae023\") " pod="openstack/barbican-db-create-znvbx" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.900991 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-operator-scripts\") pod \"barbican-beb0-account-create-update-ls746\" (UID: \"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f\") " pod="openstack/barbican-beb0-account-create-update-ls746" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.901029 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-operator-scripts\") pod \"barbican-db-create-znvbx\" (UID: \"a2435374-d4a6-4d1d-81ab-ab6bc61ae023\") " pod="openstack/barbican-db-create-znvbx" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.918525 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89rs4\" (UniqueName: 
\"kubernetes.io/projected/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-kube-api-access-89rs4\") pod \"barbican-db-create-znvbx\" (UID: \"a2435374-d4a6-4d1d-81ab-ab6bc61ae023\") " pod="openstack/barbican-db-create-znvbx" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.968762 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kctrc" Mar 20 10:55:51 crc kubenswrapper[4748]: I0320 10:55:51.993803 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-znvbx" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.004134 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z2f7\" (UniqueName: \"kubernetes.io/projected/eb7e7326-cf77-4cb1-93a5-f463fb86b184-kube-api-access-8z2f7\") pod \"neutron-f74c-account-create-update-s4q4j\" (UID: \"eb7e7326-cf77-4cb1-93a5-f463fb86b184\") " pod="openstack/neutron-f74c-account-create-update-s4q4j" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.004235 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zrvj\" (UniqueName: \"kubernetes.io/projected/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-kube-api-access-9zrvj\") pod \"barbican-beb0-account-create-update-ls746\" (UID: \"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f\") " pod="openstack/barbican-beb0-account-create-update-ls746" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.004274 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5fw4\" (UniqueName: \"kubernetes.io/projected/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-kube-api-access-z5fw4\") pod \"neutron-db-create-2hr4g\" (UID: \"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f\") " pod="openstack/neutron-db-create-2hr4g" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.004376 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-operator-scripts\") pod \"barbican-beb0-account-create-update-ls746\" (UID: \"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f\") " pod="openstack/barbican-beb0-account-create-update-ls746" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.004460 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb7e7326-cf77-4cb1-93a5-f463fb86b184-operator-scripts\") pod \"neutron-f74c-account-create-update-s4q4j\" (UID: \"eb7e7326-cf77-4cb1-93a5-f463fb86b184\") " pod="openstack/neutron-f74c-account-create-update-s4q4j" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.004499 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-operator-scripts\") pod \"neutron-db-create-2hr4g\" (UID: \"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f\") " pod="openstack/neutron-db-create-2hr4g" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.005501 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-operator-scripts\") pod \"barbican-beb0-account-create-update-ls746\" (UID: \"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f\") " pod="openstack/barbican-beb0-account-create-update-ls746" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.005625 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-operator-scripts\") pod \"neutron-db-create-2hr4g\" (UID: \"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f\") " pod="openstack/neutron-db-create-2hr4g" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.027338 4748 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z5fw4\" (UniqueName: \"kubernetes.io/projected/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-kube-api-access-z5fw4\") pod \"neutron-db-create-2hr4g\" (UID: \"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f\") " pod="openstack/neutron-db-create-2hr4g" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.032671 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zrvj\" (UniqueName: \"kubernetes.io/projected/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-kube-api-access-9zrvj\") pod \"barbican-beb0-account-create-update-ls746\" (UID: \"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f\") " pod="openstack/barbican-beb0-account-create-update-ls746" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.033447 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-edd6-account-create-update-v5v7v" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.106503 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2f7\" (UniqueName: \"kubernetes.io/projected/eb7e7326-cf77-4cb1-93a5-f463fb86b184-kube-api-access-8z2f7\") pod \"neutron-f74c-account-create-update-s4q4j\" (UID: \"eb7e7326-cf77-4cb1-93a5-f463fb86b184\") " pod="openstack/neutron-f74c-account-create-update-s4q4j" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.106650 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb7e7326-cf77-4cb1-93a5-f463fb86b184-operator-scripts\") pod \"neutron-f74c-account-create-update-s4q4j\" (UID: \"eb7e7326-cf77-4cb1-93a5-f463fb86b184\") " pod="openstack/neutron-f74c-account-create-update-s4q4j" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.107473 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb7e7326-cf77-4cb1-93a5-f463fb86b184-operator-scripts\") pod 
\"neutron-f74c-account-create-update-s4q4j\" (UID: \"eb7e7326-cf77-4cb1-93a5-f463fb86b184\") " pod="openstack/neutron-f74c-account-create-update-s4q4j" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.155368 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2hr4g" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.158941 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z2f7\" (UniqueName: \"kubernetes.io/projected/eb7e7326-cf77-4cb1-93a5-f463fb86b184-kube-api-access-8z2f7\") pod \"neutron-f74c-account-create-update-s4q4j\" (UID: \"eb7e7326-cf77-4cb1-93a5-f463fb86b184\") " pod="openstack/neutron-f74c-account-create-update-s4q4j" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.162506 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-beb0-account-create-update-ls746" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.172274 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f74c-account-create-update-s4q4j" Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.240605 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qnhjl"] Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.666639 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kctrc"] Mar 20 10:55:52 crc kubenswrapper[4748]: W0320 10:55:52.677700 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ab07a02_a8eb_4d81_a8e0_e8999fbbca4a.slice/crio-ae5efb3bc8ef4d37229fa3ea91538492c59f46a851b8baecc3347bf48ff86a79 WatchSource:0}: Error finding container ae5efb3bc8ef4d37229fa3ea91538492c59f46a851b8baecc3347bf48ff86a79: Status 404 returned error can't find the container with id ae5efb3bc8ef4d37229fa3ea91538492c59f46a851b8baecc3347bf48ff86a79 Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.824165 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2hr4g"] Mar 20 10:55:52 crc kubenswrapper[4748]: W0320 10:55:52.830882 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2435374_d4a6_4d1d_81ab_ab6bc61ae023.slice/crio-f02f6cb7d36ec819cb6efa887a742cbac0587e6c145e18268e03d36416408252 WatchSource:0}: Error finding container f02f6cb7d36ec819cb6efa887a742cbac0587e6c145e18268e03d36416408252: Status 404 returned error can't find the container with id f02f6cb7d36ec819cb6efa887a742cbac0587e6c145e18268e03d36416408252 Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.843465 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-znvbx"] Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.852226 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-edd6-account-create-update-v5v7v"] Mar 20 10:55:52 crc 
kubenswrapper[4748]: I0320 10:55:52.939780 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f74c-account-create-update-s4q4j"] Mar 20 10:55:52 crc kubenswrapper[4748]: I0320 10:55:52.948558 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-beb0-account-create-update-ls746"] Mar 20 10:55:53 crc kubenswrapper[4748]: I0320 10:55:53.153771 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-znvbx" event={"ID":"a2435374-d4a6-4d1d-81ab-ab6bc61ae023","Type":"ContainerStarted","Data":"f02f6cb7d36ec819cb6efa887a742cbac0587e6c145e18268e03d36416408252"} Mar 20 10:55:53 crc kubenswrapper[4748]: I0320 10:55:53.154904 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-edd6-account-create-update-v5v7v" event={"ID":"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be","Type":"ContainerStarted","Data":"b71f170a4ee9f827655e15b3c6bacf1e784478103042a2004f5277be6717542d"} Mar 20 10:55:53 crc kubenswrapper[4748]: I0320 10:55:53.156032 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2hr4g" event={"ID":"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f","Type":"ContainerStarted","Data":"550ea68686b9e9aef59ee0dfd98390d0d3ee4f6667e9ff1de7b4aa7eb730a51f"} Mar 20 10:55:53 crc kubenswrapper[4748]: I0320 10:55:53.157873 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kctrc" event={"ID":"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a","Type":"ContainerStarted","Data":"ae5efb3bc8ef4d37229fa3ea91538492c59f46a851b8baecc3347bf48ff86a79"} Mar 20 10:55:53 crc kubenswrapper[4748]: I0320 10:55:53.158746 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qnhjl" event={"ID":"2ed8081e-cf8f-41b3-829c-a65608af1644","Type":"ContainerStarted","Data":"87721ec913084be5864d6735b34c7723815fcb4df33d64fb4131289732e0bce2"} Mar 20 10:55:57 crc kubenswrapper[4748]: W0320 10:55:57.615891 4748 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b730a50_8cb3_43e9_af09_5bf38f7cfe3f.slice/crio-5d267e762e731bcd37bec7c873f1a250e20093afac35a27940fd4c74bc9bf833 WatchSource:0}: Error finding container 5d267e762e731bcd37bec7c873f1a250e20093afac35a27940fd4c74bc9bf833: Status 404 returned error can't find the container with id 5d267e762e731bcd37bec7c873f1a250e20093afac35a27940fd4c74bc9bf833 Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.209773 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-znvbx" event={"ID":"a2435374-d4a6-4d1d-81ab-ab6bc61ae023","Type":"ContainerStarted","Data":"25793bede7a0318a469e669ff529dd03177a5100b7973134147bf88e9eb4696b"} Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.213922 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-beb0-account-create-update-ls746" event={"ID":"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f","Type":"ContainerStarted","Data":"118d9930410d6da787a8d2f5979051084dc728d53b3d7004706200f762bb31b1"} Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.213985 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-beb0-account-create-update-ls746" event={"ID":"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f","Type":"ContainerStarted","Data":"5d267e762e731bcd37bec7c873f1a250e20093afac35a27940fd4c74bc9bf833"} Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.217234 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-edd6-account-create-update-v5v7v" event={"ID":"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be","Type":"ContainerStarted","Data":"2139ed030748c21d1eaebb5f04701c7777310e08ef8766510c63ae73b9d9eeea"} Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.218674 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2hr4g" 
event={"ID":"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f","Type":"ContainerStarted","Data":"2bdafb6b2197d279e48298731c49d3da2e047b701b39d77a84a4581d41810e0a"} Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.220416 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f74c-account-create-update-s4q4j" event={"ID":"eb7e7326-cf77-4cb1-93a5-f463fb86b184","Type":"ContainerStarted","Data":"284ac5c3e73609ade4a8deae9663a678883f6ec93e2571ca8b0166050123fddf"} Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.220462 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f74c-account-create-update-s4q4j" event={"ID":"eb7e7326-cf77-4cb1-93a5-f463fb86b184","Type":"ContainerStarted","Data":"6ead239a11c88d275f7185f60ea4a7dbcb26b89764e5bf62f21914850e67c99d"} Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.223244 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qnhjl" event={"ID":"2ed8081e-cf8f-41b3-829c-a65608af1644","Type":"ContainerStarted","Data":"05db1c5e609ddbd336478b4a6064b39774d3a3bc47feaa116f5ec5c21630f999"} Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.238451 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-znvbx" podStartSLOduration=7.238423711 podStartE2EDuration="7.238423711s" podCreationTimestamp="2026-03-20 10:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:58.230169213 +0000 UTC m=+1193.371715037" watchObservedRunningTime="2026-03-20 10:55:58.238423711 +0000 UTC m=+1193.379969525" Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.263430 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-beb0-account-create-update-ls746" podStartSLOduration=7.263405999 podStartE2EDuration="7.263405999s" podCreationTimestamp="2026-03-20 10:55:51 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:58.256656079 +0000 UTC m=+1193.398201893" watchObservedRunningTime="2026-03-20 10:55:58.263405999 +0000 UTC m=+1193.404951813" Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.287466 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-qnhjl" podStartSLOduration=7.287430363 podStartE2EDuration="7.287430363s" podCreationTimestamp="2026-03-20 10:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:58.272312323 +0000 UTC m=+1193.413858137" watchObservedRunningTime="2026-03-20 10:55:58.287430363 +0000 UTC m=+1193.428976197" Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.303150 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-2hr4g" podStartSLOduration=7.303125037 podStartE2EDuration="7.303125037s" podCreationTimestamp="2026-03-20 10:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:58.290803978 +0000 UTC m=+1193.432349812" watchObservedRunningTime="2026-03-20 10:55:58.303125037 +0000 UTC m=+1193.444670851" Mar 20 10:55:58 crc kubenswrapper[4748]: I0320 10:55:58.322231 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-edd6-account-create-update-v5v7v" podStartSLOduration=7.322195987 podStartE2EDuration="7.322195987s" podCreationTimestamp="2026-03-20 10:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:58.313389306 +0000 UTC m=+1193.454935140" watchObservedRunningTime="2026-03-20 10:55:58.322195987 +0000 UTC m=+1193.463741801" Mar 20 10:55:58 crc 
kubenswrapper[4748]: I0320 10:55:58.343146 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f74c-account-create-update-s4q4j" podStartSLOduration=7.343112753 podStartE2EDuration="7.343112753s" podCreationTimestamp="2026-03-20 10:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:55:58.339086272 +0000 UTC m=+1193.480632086" watchObservedRunningTime="2026-03-20 10:55:58.343112753 +0000 UTC m=+1193.484658567" Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.241827 4748 generic.go:334] "Generic (PLEG): container finished" podID="ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f" containerID="2bdafb6b2197d279e48298731c49d3da2e047b701b39d77a84a4581d41810e0a" exitCode=0 Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.242154 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2hr4g" event={"ID":"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f","Type":"ContainerDied","Data":"2bdafb6b2197d279e48298731c49d3da2e047b701b39d77a84a4581d41810e0a"} Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.252405 4748 generic.go:334] "Generic (PLEG): container finished" podID="eb7e7326-cf77-4cb1-93a5-f463fb86b184" containerID="284ac5c3e73609ade4a8deae9663a678883f6ec93e2571ca8b0166050123fddf" exitCode=0 Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.252521 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f74c-account-create-update-s4q4j" event={"ID":"eb7e7326-cf77-4cb1-93a5-f463fb86b184","Type":"ContainerDied","Data":"284ac5c3e73609ade4a8deae9663a678883f6ec93e2571ca8b0166050123fddf"} Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.268931 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"312254f07ceb5f888b0103261488c134c13b91355de9d3d08453416d4d87ee2b"} Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.269067 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"d44b13c546368c4edaa4ddd74e3c9deb71160321a9ad83d16934b5358ac32be6"} Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.269082 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"fd2f22f9ef208a79f1b2b63695c663a8910ee6089d449a07f2e4f7a0a6cb6ed2"} Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.272803 4748 generic.go:334] "Generic (PLEG): container finished" podID="2ed8081e-cf8f-41b3-829c-a65608af1644" containerID="05db1c5e609ddbd336478b4a6064b39774d3a3bc47feaa116f5ec5c21630f999" exitCode=0 Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.272920 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qnhjl" event={"ID":"2ed8081e-cf8f-41b3-829c-a65608af1644","Type":"ContainerDied","Data":"05db1c5e609ddbd336478b4a6064b39774d3a3bc47feaa116f5ec5c21630f999"} Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.275113 4748 generic.go:334] "Generic (PLEG): container finished" podID="a2435374-d4a6-4d1d-81ab-ab6bc61ae023" containerID="25793bede7a0318a469e669ff529dd03177a5100b7973134147bf88e9eb4696b" exitCode=0 Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.275176 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-znvbx" event={"ID":"a2435374-d4a6-4d1d-81ab-ab6bc61ae023","Type":"ContainerDied","Data":"25793bede7a0318a469e669ff529dd03177a5100b7973134147bf88e9eb4696b"} Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.281779 4748 generic.go:334] "Generic (PLEG): container 
finished" podID="2b730a50-8cb3-43e9-af09-5bf38f7cfe3f" containerID="118d9930410d6da787a8d2f5979051084dc728d53b3d7004706200f762bb31b1" exitCode=0 Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.281901 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-beb0-account-create-update-ls746" event={"ID":"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f","Type":"ContainerDied","Data":"118d9930410d6da787a8d2f5979051084dc728d53b3d7004706200f762bb31b1"} Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.285619 4748 generic.go:334] "Generic (PLEG): container finished" podID="df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be" containerID="2139ed030748c21d1eaebb5f04701c7777310e08ef8766510c63ae73b9d9eeea" exitCode=0 Mar 20 10:55:59 crc kubenswrapper[4748]: I0320 10:55:59.285659 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-edd6-account-create-update-v5v7v" event={"ID":"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be","Type":"ContainerDied","Data":"2139ed030748c21d1eaebb5f04701c7777310e08ef8766510c63ae73b9d9eeea"} Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 10:56:00.143075 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566736-zpttt"] Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 10:56:00.144926 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566736-zpttt" Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 10:56:00.147544 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 10:56:00.148270 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 10:56:00.151148 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 10:56:00.154779 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566736-zpttt"] Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 10:56:00.199427 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4xg\" (UniqueName: \"kubernetes.io/projected/1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3-kube-api-access-hd4xg\") pod \"auto-csr-approver-29566736-zpttt\" (UID: \"1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3\") " pod="openshift-infra/auto-csr-approver-29566736-zpttt" Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 10:56:00.301493 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4xg\" (UniqueName: \"kubernetes.io/projected/1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3-kube-api-access-hd4xg\") pod \"auto-csr-approver-29566736-zpttt\" (UID: \"1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3\") " pod="openshift-infra/auto-csr-approver-29566736-zpttt" Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 10:56:00.308121 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"4615da19ddf0764445538091218ef07b9d09a0c619e7f7826577de287aa493b4"} Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 
10:56:00.332828 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4xg\" (UniqueName: \"kubernetes.io/projected/1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3-kube-api-access-hd4xg\") pod \"auto-csr-approver-29566736-zpttt\" (UID: \"1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3\") " pod="openshift-infra/auto-csr-approver-29566736-zpttt" Mar 20 10:56:00 crc kubenswrapper[4748]: I0320 10:56:00.487804 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566736-zpttt" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.351482 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-beb0-account-create-update-ls746" event={"ID":"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f","Type":"ContainerDied","Data":"5d267e762e731bcd37bec7c873f1a250e20093afac35a27940fd4c74bc9bf833"} Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.353466 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d267e762e731bcd37bec7c873f1a250e20093afac35a27940fd4c74bc9bf833" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.353587 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-edd6-account-create-update-v5v7v" event={"ID":"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be","Type":"ContainerDied","Data":"b71f170a4ee9f827655e15b3c6bacf1e784478103042a2004f5277be6717542d"} Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.353744 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b71f170a4ee9f827655e15b3c6bacf1e784478103042a2004f5277be6717542d" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.355284 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2hr4g" event={"ID":"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f","Type":"ContainerDied","Data":"550ea68686b9e9aef59ee0dfd98390d0d3ee4f6667e9ff1de7b4aa7eb730a51f"} Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 
10:56:02.355405 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="550ea68686b9e9aef59ee0dfd98390d0d3ee4f6667e9ff1de7b4aa7eb730a51f" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.357361 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f74c-account-create-update-s4q4j" event={"ID":"eb7e7326-cf77-4cb1-93a5-f463fb86b184","Type":"ContainerDied","Data":"6ead239a11c88d275f7185f60ea4a7dbcb26b89764e5bf62f21914850e67c99d"} Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.357397 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ead239a11c88d275f7185f60ea4a7dbcb26b89764e5bf62f21914850e67c99d" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.359722 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qnhjl" event={"ID":"2ed8081e-cf8f-41b3-829c-a65608af1644","Type":"ContainerDied","Data":"87721ec913084be5864d6735b34c7723815fcb4df33d64fb4131289732e0bce2"} Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.359783 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87721ec913084be5864d6735b34c7723815fcb4df33d64fb4131289732e0bce2" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.362038 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-znvbx" event={"ID":"a2435374-d4a6-4d1d-81ab-ab6bc61ae023","Type":"ContainerDied","Data":"f02f6cb7d36ec819cb6efa887a742cbac0587e6c145e18268e03d36416408252"} Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.362180 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f02f6cb7d36ec819cb6efa887a742cbac0587e6c145e18268e03d36416408252" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.502902 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f74c-account-create-update-s4q4j" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.507483 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qnhjl" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.544411 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-beb0-account-create-update-ls746" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.555923 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjv9q\" (UniqueName: \"kubernetes.io/projected/2ed8081e-cf8f-41b3-829c-a65608af1644-kube-api-access-rjv9q\") pod \"2ed8081e-cf8f-41b3-829c-a65608af1644\" (UID: \"2ed8081e-cf8f-41b3-829c-a65608af1644\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.555978 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ed8081e-cf8f-41b3-829c-a65608af1644-operator-scripts\") pod \"2ed8081e-cf8f-41b3-829c-a65608af1644\" (UID: \"2ed8081e-cf8f-41b3-829c-a65608af1644\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.556077 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb7e7326-cf77-4cb1-93a5-f463fb86b184-operator-scripts\") pod \"eb7e7326-cf77-4cb1-93a5-f463fb86b184\" (UID: \"eb7e7326-cf77-4cb1-93a5-f463fb86b184\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.556175 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z2f7\" (UniqueName: \"kubernetes.io/projected/eb7e7326-cf77-4cb1-93a5-f463fb86b184-kube-api-access-8z2f7\") pod \"eb7e7326-cf77-4cb1-93a5-f463fb86b184\" (UID: \"eb7e7326-cf77-4cb1-93a5-f463fb86b184\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.562874 4748 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed8081e-cf8f-41b3-829c-a65608af1644-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ed8081e-cf8f-41b3-829c-a65608af1644" (UID: "2ed8081e-cf8f-41b3-829c-a65608af1644"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.563387 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb7e7326-cf77-4cb1-93a5-f463fb86b184-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb7e7326-cf77-4cb1-93a5-f463fb86b184" (UID: "eb7e7326-cf77-4cb1-93a5-f463fb86b184"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.571967 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2hr4g" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.572360 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb7e7326-cf77-4cb1-93a5-f463fb86b184-kube-api-access-8z2f7" (OuterVolumeSpecName: "kube-api-access-8z2f7") pod "eb7e7326-cf77-4cb1-93a5-f463fb86b184" (UID: "eb7e7326-cf77-4cb1-93a5-f463fb86b184"). InnerVolumeSpecName "kube-api-access-8z2f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.573318 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed8081e-cf8f-41b3-829c-a65608af1644-kube-api-access-rjv9q" (OuterVolumeSpecName: "kube-api-access-rjv9q") pod "2ed8081e-cf8f-41b3-829c-a65608af1644" (UID: "2ed8081e-cf8f-41b3-829c-a65608af1644"). InnerVolumeSpecName "kube-api-access-rjv9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.581961 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-znvbx" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.595141 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-edd6-account-create-update-v5v7v" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.658177 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqldq\" (UniqueName: \"kubernetes.io/projected/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-kube-api-access-mqldq\") pod \"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be\" (UID: \"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.658264 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zrvj\" (UniqueName: \"kubernetes.io/projected/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-kube-api-access-9zrvj\") pod \"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f\" (UID: \"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.658351 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-operator-scripts\") pod \"a2435374-d4a6-4d1d-81ab-ab6bc61ae023\" (UID: \"a2435374-d4a6-4d1d-81ab-ab6bc61ae023\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.658399 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89rs4\" (UniqueName: \"kubernetes.io/projected/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-kube-api-access-89rs4\") pod \"a2435374-d4a6-4d1d-81ab-ab6bc61ae023\" (UID: \"a2435374-d4a6-4d1d-81ab-ab6bc61ae023\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.658493 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-operator-scripts\") pod \"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f\" (UID: \"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.658550 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5fw4\" (UniqueName: \"kubernetes.io/projected/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-kube-api-access-z5fw4\") pod \"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f\" (UID: \"ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.658625 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-operator-scripts\") pod \"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be\" (UID: \"df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.658713 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-operator-scripts\") pod \"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f\" (UID: \"2b730a50-8cb3-43e9-af09-5bf38f7cfe3f\") " Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.659144 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z2f7\" (UniqueName: \"kubernetes.io/projected/eb7e7326-cf77-4cb1-93a5-f463fb86b184-kube-api-access-8z2f7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.659168 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjv9q\" (UniqueName: \"kubernetes.io/projected/2ed8081e-cf8f-41b3-829c-a65608af1644-kube-api-access-rjv9q\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.659184 4748 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ed8081e-cf8f-41b3-829c-a65608af1644-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.659199 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb7e7326-cf77-4cb1-93a5-f463fb86b184-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.659197 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f" (UID: "ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.659678 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2435374-d4a6-4d1d-81ab-ab6bc61ae023" (UID: "a2435374-d4a6-4d1d-81ab-ab6bc61ae023"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.659731 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b730a50-8cb3-43e9-af09-5bf38f7cfe3f" (UID: "2b730a50-8cb3-43e9-af09-5bf38f7cfe3f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.660218 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be" (UID: "df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.662410 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-kube-api-access-mqldq" (OuterVolumeSpecName: "kube-api-access-mqldq") pod "df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be" (UID: "df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be"). InnerVolumeSpecName "kube-api-access-mqldq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.669521 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-kube-api-access-9zrvj" (OuterVolumeSpecName: "kube-api-access-9zrvj") pod "2b730a50-8cb3-43e9-af09-5bf38f7cfe3f" (UID: "2b730a50-8cb3-43e9-af09-5bf38f7cfe3f"). InnerVolumeSpecName "kube-api-access-9zrvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.669594 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-kube-api-access-z5fw4" (OuterVolumeSpecName: "kube-api-access-z5fw4") pod "ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f" (UID: "ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f"). InnerVolumeSpecName "kube-api-access-z5fw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.669938 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-kube-api-access-89rs4" (OuterVolumeSpecName: "kube-api-access-89rs4") pod "a2435374-d4a6-4d1d-81ab-ab6bc61ae023" (UID: "a2435374-d4a6-4d1d-81ab-ab6bc61ae023"). InnerVolumeSpecName "kube-api-access-89rs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.757876 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566736-zpttt"] Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.760263 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.760319 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqldq\" (UniqueName: \"kubernetes.io/projected/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-kube-api-access-mqldq\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.760333 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zrvj\" (UniqueName: \"kubernetes.io/projected/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f-kube-api-access-9zrvj\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.760345 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.760353 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89rs4\" (UniqueName: 
\"kubernetes.io/projected/a2435374-d4a6-4d1d-81ab-ab6bc61ae023-kube-api-access-89rs4\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.760362 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.760372 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5fw4\" (UniqueName: \"kubernetes.io/projected/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f-kube-api-access-z5fw4\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:02 crc kubenswrapper[4748]: I0320 10:56:02.760381 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.378272 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566736-zpttt" event={"ID":"1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3","Type":"ContainerStarted","Data":"a8b13a193335f88ff1d2e7a03b4af303069e3593ee8e50e13bc0d62870bf8b84"} Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.380607 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kctrc" event={"ID":"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a","Type":"ContainerStarted","Data":"2f176c006cdeb4d99ef251284203bab1e079b75677e00e6eda348f7000787342"} Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.383109 4748 generic.go:334] "Generic (PLEG): container finished" podID="dfabafe6-f3da-44d2-bb13-cc39bffc6dbd" containerID="b8f75909bd9bc56662a6978c2f85bf27abd90e98a29d9195c5fe14dc0b753532" exitCode=0 Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.383168 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t76rv" 
event={"ID":"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd","Type":"ContainerDied","Data":"b8f75909bd9bc56662a6978c2f85bf27abd90e98a29d9195c5fe14dc0b753532"} Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.383247 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-beb0-account-create-update-ls746" Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.383267 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qnhjl" Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.383258 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-znvbx" Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.383372 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2hr4g" Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.383561 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f74c-account-create-update-s4q4j" Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.383662 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-edd6-account-create-update-v5v7v" Mar 20 10:56:03 crc kubenswrapper[4748]: I0320 10:56:03.402578 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kctrc" podStartSLOduration=2.7588344989999998 podStartE2EDuration="12.402556063s" podCreationTimestamp="2026-03-20 10:55:51 +0000 UTC" firstStartedPulling="2026-03-20 10:55:52.689534516 +0000 UTC m=+1187.831080320" lastFinishedPulling="2026-03-20 10:56:02.33325607 +0000 UTC m=+1197.474801884" observedRunningTime="2026-03-20 10:56:03.398594264 +0000 UTC m=+1198.540140098" watchObservedRunningTime="2026-03-20 10:56:03.402556063 +0000 UTC m=+1198.544101887" Mar 20 10:56:04 crc kubenswrapper[4748]: I0320 10:56:04.396477 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"1eb9e5fbd18fddb21dfe423b70aa92a4d5cc29392d2d93a1bda1c9bbda68e35f"} Mar 20 10:56:04 crc kubenswrapper[4748]: I0320 10:56:04.397133 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"966caf677f15a624939e7eeedd11b668d27cf83abbba357f9324bb5176c89f12"} Mar 20 10:56:04 crc kubenswrapper[4748]: I0320 10:56:04.397149 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"b295017c0381d184594fce61c0b9511f16379f69f357eadb90551b9e30e42985"} Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.428452 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"477a70ee5bd6b48ca3fb8ad49abb0de52b75d9634cae646a1d7f6ffd924ea127"} Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.432159 4748 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-t76rv" event={"ID":"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd","Type":"ContainerDied","Data":"9fe3147ab0152b5b9c83f5d3c060b48df0756a6ba68993d41cbb466ea2dfb079"} Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.432208 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fe3147ab0152b5b9c83f5d3c060b48df0756a6ba68993d41cbb466ea2dfb079" Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.476618 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-t76rv" Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.515637 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-config-data\") pod \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.515784 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-combined-ca-bundle\") pod \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.515861 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbdvq\" (UniqueName: \"kubernetes.io/projected/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-kube-api-access-mbdvq\") pod \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.515963 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-db-sync-config-data\") pod 
\"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\" (UID: \"dfabafe6-f3da-44d2-bb13-cc39bffc6dbd\") " Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.524288 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-kube-api-access-mbdvq" (OuterVolumeSpecName: "kube-api-access-mbdvq") pod "dfabafe6-f3da-44d2-bb13-cc39bffc6dbd" (UID: "dfabafe6-f3da-44d2-bb13-cc39bffc6dbd"). InnerVolumeSpecName "kube-api-access-mbdvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.541200 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dfabafe6-f3da-44d2-bb13-cc39bffc6dbd" (UID: "dfabafe6-f3da-44d2-bb13-cc39bffc6dbd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.553996 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfabafe6-f3da-44d2-bb13-cc39bffc6dbd" (UID: "dfabafe6-f3da-44d2-bb13-cc39bffc6dbd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.567206 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-config-data" (OuterVolumeSpecName: "config-data") pod "dfabafe6-f3da-44d2-bb13-cc39bffc6dbd" (UID: "dfabafe6-f3da-44d2-bb13-cc39bffc6dbd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.617674 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.617709 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbdvq\" (UniqueName: \"kubernetes.io/projected/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-kube-api-access-mbdvq\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.617723 4748 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:05 crc kubenswrapper[4748]: I0320 10:56:05.617737 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.447140 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566736-zpttt" event={"ID":"1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3","Type":"ContainerStarted","Data":"e587cab5b997726602fd7a4fea64c3dd529dc0c8b7bed010dfa12a6999b395b7"} Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.447169 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-t76rv" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.811500 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-67n95"] Mar 20 10:56:06 crc kubenswrapper[4748]: E0320 10:56:06.812241 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b730a50-8cb3-43e9-af09-5bf38f7cfe3f" containerName="mariadb-account-create-update" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812260 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b730a50-8cb3-43e9-af09-5bf38f7cfe3f" containerName="mariadb-account-create-update" Mar 20 10:56:06 crc kubenswrapper[4748]: E0320 10:56:06.812277 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfabafe6-f3da-44d2-bb13-cc39bffc6dbd" containerName="glance-db-sync" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812283 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfabafe6-f3da-44d2-bb13-cc39bffc6dbd" containerName="glance-db-sync" Mar 20 10:56:06 crc kubenswrapper[4748]: E0320 10:56:06.812295 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7e7326-cf77-4cb1-93a5-f463fb86b184" containerName="mariadb-account-create-update" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812301 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7e7326-cf77-4cb1-93a5-f463fb86b184" containerName="mariadb-account-create-update" Mar 20 10:56:06 crc kubenswrapper[4748]: E0320 10:56:06.812312 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed8081e-cf8f-41b3-829c-a65608af1644" containerName="mariadb-database-create" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812318 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed8081e-cf8f-41b3-829c-a65608af1644" containerName="mariadb-database-create" Mar 20 10:56:06 crc kubenswrapper[4748]: E0320 10:56:06.812340 4748 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be" containerName="mariadb-account-create-update" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812345 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be" containerName="mariadb-account-create-update" Mar 20 10:56:06 crc kubenswrapper[4748]: E0320 10:56:06.812357 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2435374-d4a6-4d1d-81ab-ab6bc61ae023" containerName="mariadb-database-create" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812362 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2435374-d4a6-4d1d-81ab-ab6bc61ae023" containerName="mariadb-database-create" Mar 20 10:56:06 crc kubenswrapper[4748]: E0320 10:56:06.812370 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f" containerName="mariadb-database-create" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812377 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f" containerName="mariadb-database-create" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812526 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb7e7326-cf77-4cb1-93a5-f463fb86b184" containerName="mariadb-account-create-update" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812538 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b730a50-8cb3-43e9-af09-5bf38f7cfe3f" containerName="mariadb-account-create-update" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812549 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f" containerName="mariadb-database-create" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812555 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed8081e-cf8f-41b3-829c-a65608af1644" containerName="mariadb-database-create" Mar 20 
10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812565 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2435374-d4a6-4d1d-81ab-ab6bc61ae023" containerName="mariadb-database-create" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812573 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfabafe6-f3da-44d2-bb13-cc39bffc6dbd" containerName="glance-db-sync" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.812584 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be" containerName="mariadb-account-create-update" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.813482 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.835059 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz55l\" (UniqueName: \"kubernetes.io/projected/c5e2826a-d87e-4002-bece-b3eed4990771-kube-api-access-lz55l\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.835123 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.835150 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-config\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.835183 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.835248 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.841287 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-67n95"] Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.936907 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-config\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.937007 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.937163 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.937223 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz55l\" (UniqueName: \"kubernetes.io/projected/c5e2826a-d87e-4002-bece-b3eed4990771-kube-api-access-lz55l\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.937272 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.937950 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-config\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.938269 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.938719 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.938977 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:06 crc kubenswrapper[4748]: I0320 10:56:06.964564 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz55l\" (UniqueName: \"kubernetes.io/projected/c5e2826a-d87e-4002-bece-b3eed4990771-kube-api-access-lz55l\") pod \"dnsmasq-dns-5b946c75cc-67n95\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") " pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:07 crc kubenswrapper[4748]: I0320 10:56:07.131854 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:07 crc kubenswrapper[4748]: I0320 10:56:07.460298 4748 generic.go:334] "Generic (PLEG): container finished" podID="1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3" containerID="e587cab5b997726602fd7a4fea64c3dd529dc0c8b7bed010dfa12a6999b395b7" exitCode=0 Mar 20 10:56:07 crc kubenswrapper[4748]: I0320 10:56:07.460425 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566736-zpttt" event={"ID":"1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3","Type":"ContainerDied","Data":"e587cab5b997726602fd7a4fea64c3dd529dc0c8b7bed010dfa12a6999b395b7"} Mar 20 10:56:07 crc kubenswrapper[4748]: I0320 10:56:07.835643 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-67n95"] Mar 20 10:56:08 crc kubenswrapper[4748]: I0320 10:56:08.481405 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"291238f21819abbe4e3cbe813ec7120bc1022e36d2bb2271adc4f6a9e5f8394f"} Mar 20 10:56:08 crc kubenswrapper[4748]: I0320 10:56:08.481874 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"4d59073542643ec1a62c8eb1ed38fea0036af08bcf7710d8a077215109f59120"} Mar 20 10:56:08 crc kubenswrapper[4748]: I0320 10:56:08.481891 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"fc298d699f3a86df3f7e9661ab653cd0b84c431642b5c5b770524ace843557e2"} Mar 20 10:56:08 crc kubenswrapper[4748]: I0320 10:56:08.481901 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"c14bfc0aa1843fbdc37a851a193e2a509ba5fcce8e217b939c3897ade95062fb"} Mar 20 10:56:08 crc kubenswrapper[4748]: I0320 10:56:08.484344 4748 generic.go:334] "Generic (PLEG): container finished" podID="c5e2826a-d87e-4002-bece-b3eed4990771" containerID="e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599" exitCode=0 Mar 20 10:56:08 crc kubenswrapper[4748]: I0320 10:56:08.485000 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" event={"ID":"c5e2826a-d87e-4002-bece-b3eed4990771","Type":"ContainerDied","Data":"e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599"} Mar 20 10:56:08 crc kubenswrapper[4748]: I0320 10:56:08.485091 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" event={"ID":"c5e2826a-d87e-4002-bece-b3eed4990771","Type":"ContainerStarted","Data":"8bdfecc9dcbb210dedcfd1f67e19c1da4b9d70c765b8606a90de5aa699670bba"} Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.009282 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566736-zpttt" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.195655 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd4xg\" (UniqueName: \"kubernetes.io/projected/1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3-kube-api-access-hd4xg\") pod \"1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3\" (UID: \"1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3\") " Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.201962 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3-kube-api-access-hd4xg" (OuterVolumeSpecName: "kube-api-access-hd4xg") pod "1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3" (UID: "1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3"). 
InnerVolumeSpecName "kube-api-access-hd4xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.297989 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd4xg\" (UniqueName: \"kubernetes.io/projected/1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3-kube-api-access-hd4xg\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.492850 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566736-zpttt" event={"ID":"1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3","Type":"ContainerDied","Data":"a8b13a193335f88ff1d2e7a03b4af303069e3593ee8e50e13bc0d62870bf8b84"} Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.492903 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8b13a193335f88ff1d2e7a03b4af303069e3593ee8e50e13bc0d62870bf8b84" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.492910 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566736-zpttt" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.494620 4748 generic.go:334] "Generic (PLEG): container finished" podID="3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a" containerID="2f176c006cdeb4d99ef251284203bab1e079b75677e00e6eda348f7000787342" exitCode=0 Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.494678 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kctrc" event={"ID":"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a","Type":"ContainerDied","Data":"2f176c006cdeb4d99ef251284203bab1e079b75677e00e6eda348f7000787342"} Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.501093 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"34114e484767c94d71fd7a9c968684e10d9ba7d310ac01f19636fdf70aba6947"} Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.501120 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"be9d800e3a66581319191d8c74ddb7f3758d8ee72903b833bf6889425c689808"} Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.501131 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7753c601-7739-4165-b5f2-a673b0797334","Type":"ContainerStarted","Data":"b65a4447a1077414e12f6f041b355473294945455ea9ae4ba51ceea75c663fca"} Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.506361 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" event={"ID":"c5e2826a-d87e-4002-bece-b3eed4990771","Type":"ContainerStarted","Data":"0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b"} Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.506641 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.591307 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" podStartSLOduration=3.591282895 podStartE2EDuration="3.591282895s" podCreationTimestamp="2026-03-20 10:56:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:09.590946606 +0000 UTC m=+1204.732492430" watchObservedRunningTime="2026-03-20 10:56:09.591282895 +0000 UTC m=+1204.732828709" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.593671 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.89498219 podStartE2EDuration="54.593660275s" podCreationTimestamp="2026-03-20 10:55:15 +0000 UTC" firstStartedPulling="2026-03-20 10:55:48.832122546 +0000 UTC m=+1183.973668360" lastFinishedPulling="2026-03-20 10:56:07.530800631 +0000 UTC m=+1202.672346445" observedRunningTime="2026-03-20 10:56:09.573264342 +0000 UTC m=+1204.714810156" watchObservedRunningTime="2026-03-20 10:56:09.593660275 +0000 UTC m=+1204.735206079" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.840649 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-67n95"] Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.873816 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qpm57"] Mar 20 10:56:09 crc kubenswrapper[4748]: E0320 10:56:09.874283 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3" containerName="oc" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.874351 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3" containerName="oc" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.874610 4748 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3" containerName="oc" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.875728 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.880793 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 10:56:09 crc kubenswrapper[4748]: I0320 10:56:09.898980 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qpm57"] Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.008637 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.008710 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.008752 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.008799 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g8tb\" (UniqueName: \"kubernetes.io/projected/543b35d5-7087-4a09-bbfb-6937b3765855-kube-api-access-4g8tb\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.008969 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-config\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.009025 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.082981 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566730-gq92r"] Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.091073 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566730-gq92r"] Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.110103 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-config\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.110192 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.110238 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.110261 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.110295 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.110331 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g8tb\" (UniqueName: \"kubernetes.io/projected/543b35d5-7087-4a09-bbfb-6937b3765855-kube-api-access-4g8tb\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.111402 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.111414 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-config\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.111449 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.111427 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.111576 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.131400 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g8tb\" (UniqueName: \"kubernetes.io/projected/543b35d5-7087-4a09-bbfb-6937b3765855-kube-api-access-4g8tb\") pod 
\"dnsmasq-dns-74f6bcbc87-qpm57\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.195745 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.675309 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qpm57"] Mar 20 10:56:10 crc kubenswrapper[4748]: I0320 10:56:10.922008 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kctrc" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.027305 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg5hj\" (UniqueName: \"kubernetes.io/projected/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-kube-api-access-tg5hj\") pod \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.027538 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-combined-ca-bundle\") pod \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.027576 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-config-data\") pod \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\" (UID: \"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a\") " Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.031773 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-kube-api-access-tg5hj" (OuterVolumeSpecName: 
"kube-api-access-tg5hj") pod "3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a" (UID: "3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a"). InnerVolumeSpecName "kube-api-access-tg5hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.062121 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a" (UID: "3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.082384 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-config-data" (OuterVolumeSpecName: "config-data") pod "3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a" (UID: "3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.129432 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.129525 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.129544 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg5hj\" (UniqueName: \"kubernetes.io/projected/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a-kube-api-access-tg5hj\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.527863 4748 generic.go:334] "Generic (PLEG): container finished" podID="543b35d5-7087-4a09-bbfb-6937b3765855" containerID="4f534fdbc40d88a75c92508b0831ab6db7bba17dfabbce4a88fd5ea0bf9fea07" exitCode=0 Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.531184 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kctrc" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.531322 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" podUID="c5e2826a-d87e-4002-bece-b3eed4990771" containerName="dnsmasq-dns" containerID="cri-o://0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b" gracePeriod=10 Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.547529 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3d53081-2293-4131-bbfb-99306c7b2c0a" path="/var/lib/kubelet/pods/a3d53081-2293-4131-bbfb-99306c7b2c0a/volumes" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.548572 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" event={"ID":"543b35d5-7087-4a09-bbfb-6937b3765855","Type":"ContainerDied","Data":"4f534fdbc40d88a75c92508b0831ab6db7bba17dfabbce4a88fd5ea0bf9fea07"} Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.548614 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" event={"ID":"543b35d5-7087-4a09-bbfb-6937b3765855","Type":"ContainerStarted","Data":"c47f490911807806df5ff33fc4949a3ab383925901b1513709f35bd46ed90c91"} Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.548633 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kctrc" event={"ID":"3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a","Type":"ContainerDied","Data":"ae5efb3bc8ef4d37229fa3ea91538492c59f46a851b8baecc3347bf48ff86a79"} Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.548648 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae5efb3bc8ef4d37229fa3ea91538492c59f46a851b8baecc3347bf48ff86a79" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.811111 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qpm57"] Mar 20 
10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.829272 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-42s2j"] Mar 20 10:56:11 crc kubenswrapper[4748]: E0320 10:56:11.829674 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a" containerName="keystone-db-sync" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.829690 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a" containerName="keystone-db-sync" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.829906 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a" containerName="keystone-db-sync" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.830424 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.833997 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.834189 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.836630 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.837022 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n27fv" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.838420 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.855943 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-42s2j"] Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.910708 4748 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8t679"] Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.912469 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.925639 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8t679"] Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.944259 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-fernet-keys\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.944313 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-combined-ca-bundle\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.944369 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2n28\" (UniqueName: \"kubernetes.io/projected/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-kube-api-access-v2n28\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.944445 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-scripts\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " 
pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.944512 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-config-data\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:11 crc kubenswrapper[4748]: I0320 10:56:11.944540 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-credential-keys\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.055491 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-credential-keys\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.055546 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-fernet-keys\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.055580 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-combined-ca-bundle\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc 
kubenswrapper[4748]: I0320 10:56:12.055632 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2n28\" (UniqueName: \"kubernetes.io/projected/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-kube-api-access-v2n28\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.055664 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-config\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.055695 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-svc\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.055734 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.055784 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-scripts\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 
10:56:12.055854 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfm8\" (UniqueName: \"kubernetes.io/projected/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-kube-api-access-7qfm8\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.055881 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.055905 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.055932 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-config-data\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.066698 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-r6t2v"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.067911 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.073784 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-scripts\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.076644 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.076792 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-credential-keys\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.077283 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.077488 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6n2f8" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.082701 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-fernet-keys\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.084720 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-combined-ca-bundle\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " 
pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.087876 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-config-data\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.092429 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2n28\" (UniqueName: \"kubernetes.io/projected/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-kube-api-access-v2n28\") pod \"keystone-bootstrap-42s2j\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") " pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.099174 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-r6t2v"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.122566 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b8d4ccdcc-grvq4"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.127774 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.131977 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.132245 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-tskbv" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.148307 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.191300 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-config\") pod \"neutron-db-sync-r6t2v\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.192283 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dddc05be-fe06-4c29-8102-d6118b6f36d5-logs\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.192427 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mj62\" (UniqueName: \"kubernetes.io/projected/e7f8f96f-de61-435d-a542-0b10d8860ccd-kube-api-access-7mj62\") pod \"neutron-db-sync-r6t2v\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.192638 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-config\") pod 
\"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.192758 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-config-data\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.192848 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-svc\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.193263 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-combined-ca-bundle\") pod \"neutron-db-sync-r6t2v\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.193362 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.200355 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dddc05be-fe06-4c29-8102-d6118b6f36d5-horizon-secret-key\") pod 
\"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.200468 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-svc\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.200741 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfm8\" (UniqueName: \"kubernetes.io/projected/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-kube-api-access-7qfm8\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.200899 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.200983 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-scripts\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.201616 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: 
\"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.195620 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.205079 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.205674 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.227763 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-42s2j" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.241430 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-config\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.242390 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.246035 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b8d4ccdcc-grvq4"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.248008 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfm8\" (UniqueName: \"kubernetes.io/projected/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-kube-api-access-7qfm8\") pod \"dnsmasq-dns-847c4cc679-8t679\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.306637 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-config\") pod \"neutron-db-sync-r6t2v\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.306708 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dddc05be-fe06-4c29-8102-d6118b6f36d5-logs\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: 
\"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.306779 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mj62\" (UniqueName: \"kubernetes.io/projected/e7f8f96f-de61-435d-a542-0b10d8860ccd-kube-api-access-7mj62\") pod \"neutron-db-sync-r6t2v\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.306862 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-config-data\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.306927 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-combined-ca-bundle\") pod \"neutron-db-sync-r6t2v\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.306992 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dddc05be-fe06-4c29-8102-d6118b6f36d5-horizon-secret-key\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.307045 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-scripts\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc 
kubenswrapper[4748]: I0320 10:56:12.307081 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxb4f\" (UniqueName: \"kubernetes.io/projected/dddc05be-fe06-4c29-8102-d6118b6f36d5-kube-api-access-jxb4f\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.317266 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-config-data\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.317520 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dddc05be-fe06-4c29-8102-d6118b6f36d5-logs\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.324709 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-scripts\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.329446 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-g4qfl"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.351270 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-combined-ca-bundle\") pod \"neutron-db-sync-r6t2v\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " pod="openstack/neutron-db-sync-r6t2v" Mar 20 
10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.368960 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dddc05be-fe06-4c29-8102-d6118b6f36d5-horizon-secret-key\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.390101 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-g4qfl" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.393166 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-g4qfl"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.402147 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.409166 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxb4f\" (UniqueName: \"kubernetes.io/projected/dddc05be-fe06-4c29-8102-d6118b6f36d5-kube-api-access-jxb4f\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.409434 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mj62\" (UniqueName: \"kubernetes.io/projected/e7f8f96f-de61-435d-a542-0b10d8860ccd-kube-api-access-7mj62\") pod \"neutron-db-sync-r6t2v\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.418382 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.418595 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-config\") pod \"neutron-db-sync-r6t2v\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.419227 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2l5mf" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.458226 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxb4f\" (UniqueName: \"kubernetes.io/projected/dddc05be-fe06-4c29-8102-d6118b6f36d5-kube-api-access-jxb4f\") pod \"horizon-b8d4ccdcc-grvq4\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.466626 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-rm5bp"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.468322 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rm5bp" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.486110 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-q94rm" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.487417 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.488211 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.507617 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rm5bp"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.512822 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6340e84c-8ffb-40e5-a470-52a50bff86f1-logs\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.512888 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-scripts\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.512948 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-combined-ca-bundle\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.513015 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-config-data\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.513088 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jbt7\" (UniqueName: \"kubernetes.io/projected/6340e84c-8ffb-40e5-a470-52a50bff86f1-kube-api-access-2jbt7\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.537270 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.543033 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.543493 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:56:12 crc kubenswrapper[4748]: E0320 10:56:12.543992 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e2826a-d87e-4002-bece-b3eed4990771" containerName="dnsmasq-dns" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.544008 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e2826a-d87e-4002-bece-b3eed4990771" containerName="dnsmasq-dns" Mar 20 10:56:12 crc kubenswrapper[4748]: E0320 10:56:12.544039 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e2826a-d87e-4002-bece-b3eed4990771" containerName="init" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.544047 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e2826a-d87e-4002-bece-b3eed4990771" containerName="init" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.544258 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e2826a-d87e-4002-bece-b3eed4990771" containerName="dnsmasq-dns" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.545632 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.550179 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.550915 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.551198 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.552699 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vdzj8" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.561048 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.582592 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8t679"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.589374 4748 generic.go:334] "Generic (PLEG): container finished" podID="c5e2826a-d87e-4002-bece-b3eed4990771" containerID="0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b" exitCode=0 Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.589460 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" event={"ID":"c5e2826a-d87e-4002-bece-b3eed4990771","Type":"ContainerDied","Data":"0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b"} Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.589500 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" event={"ID":"c5e2826a-d87e-4002-bece-b3eed4990771","Type":"ContainerDied","Data":"8bdfecc9dcbb210dedcfd1f67e19c1da4b9d70c765b8606a90de5aa699670bba"} Mar 20 10:56:12 crc 
kubenswrapper[4748]: I0320 10:56:12.589521 4748 scope.go:117] "RemoveContainer" containerID="0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.589659 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-67n95" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.591529 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.595801 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.603535 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-j7jqs"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.607637 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.608904 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-j7jqs" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.618494 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" podUID="543b35d5-7087-4a09-bbfb-6937b3765855" containerName="dnsmasq-dns" containerID="cri-o://cc3d10b384b48095b3ea900b6f455aeed1fe457a88698c25a7111902fa133019" gracePeriod=10 Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.618999 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.619072 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hfmzk" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.619173 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.619259 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.619277 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.619329 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" event={"ID":"543b35d5-7087-4a09-bbfb-6937b3765855","Type":"ContainerStarted","Data":"cc3d10b384b48095b3ea900b6f455aeed1fe457a88698c25a7111902fa133019"} Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.619669 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-r6t2v"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.620246 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-combined-ca-bundle\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.620805 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jbt7\" (UniqueName: \"kubernetes.io/projected/6340e84c-8ffb-40e5-a470-52a50bff86f1-kube-api-access-2jbt7\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.620934 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng9kt\" (UniqueName: \"kubernetes.io/projected/af16052b-a5ab-4244-b007-69a32d050a35-kube-api-access-ng9kt\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.621086 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6340e84c-8ffb-40e5-a470-52a50bff86f1-logs\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.621158 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-scripts\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.621235 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af16052b-a5ab-4244-b007-69a32d050a35-etc-machine-id\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.621332 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-db-sync-config-data\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.621464 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-combined-ca-bundle\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.621727 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-config-data\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.621802 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-config-data\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.621890 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6340e84c-8ffb-40e5-a470-52a50bff86f1-logs\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.621987 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-scripts\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.625333 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-scripts\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.626163 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-config-data\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.628789 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-combined-ca-bundle\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.629672 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-j7jqs"]
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.639448 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5g98d"]
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.640785 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.650346 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-69d8fb6c75-qkwp6"]
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.666485 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d8fb6c75-qkwp6"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.689392 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5g98d"]
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.701642 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.703178 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.709388 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.715261 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69d8fb6c75-qkwp6"]
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.718207 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jbt7\" (UniqueName: \"kubernetes.io/projected/6340e84c-8ffb-40e5-a470-52a50bff86f1-kube-api-access-2jbt7\") pod \"placement-db-sync-g4qfl\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.735890 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz55l\" (UniqueName: \"kubernetes.io/projected/c5e2826a-d87e-4002-bece-b3eed4990771-kube-api-access-lz55l\") pod \"c5e2826a-d87e-4002-bece-b3eed4990771\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") "
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.736086 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-nb\") pod \"c5e2826a-d87e-4002-bece-b3eed4990771\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") "
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.736123 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-dns-svc\") pod \"c5e2826a-d87e-4002-bece-b3eed4990771\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") "
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.736236 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-sb\") pod \"c5e2826a-d87e-4002-bece-b3eed4990771\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") "
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.736295 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-config\") pod \"c5e2826a-d87e-4002-bece-b3eed4990771\" (UID: \"c5e2826a-d87e-4002-bece-b3eed4990771\") "
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.736769 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.736804 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.736850 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.736908 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af16052b-a5ab-4244-b007-69a32d050a35-etc-machine-id\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.736939 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-db-sync-config-data\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.736985 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-config-data\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737012 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737034 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737062 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-db-sync-config-data\") pod \"barbican-db-sync-j7jqs\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " pod="openstack/barbican-db-sync-j7jqs"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737080 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-scripts\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737111 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737140 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-combined-ca-bundle\") pod \"barbican-db-sync-j7jqs\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " pod="openstack/barbican-db-sync-j7jqs"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737201 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrmm6\" (UniqueName: \"kubernetes.io/projected/16ac2620-c820-4881-b959-d12478e5f4ba-kube-api-access-wrmm6\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737222 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737243 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-logs\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737291 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhtff\" (UniqueName: \"kubernetes.io/projected/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-kube-api-access-qhtff\") pod \"barbican-db-sync-j7jqs\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " pod="openstack/barbican-db-sync-j7jqs"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737318 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737351 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-config-data\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737377 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzsrp\" (UniqueName: \"kubernetes.io/projected/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-kube-api-access-xzsrp\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737430 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-scripts\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737455 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737492 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737518 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-combined-ca-bundle\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737541 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-run-httpd\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737574 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-config\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737599 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737616 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737639 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng9kt\" (UniqueName: \"kubernetes.io/projected/af16052b-a5ab-4244-b007-69a32d050a35-kube-api-access-ng9kt\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737660 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-log-httpd\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.737686 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqdnr\" (UniqueName: \"kubernetes.io/projected/24adc7ea-6cf7-416b-9570-a92736a9b48b-kube-api-access-zqdnr\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.742139 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.745495 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-db-sync-config-data\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.753241 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af16052b-a5ab-4244-b007-69a32d050a35-etc-machine-id\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.757657 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-combined-ca-bundle\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.758189 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b8d4ccdcc-grvq4"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.760153 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-config-data\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.765805 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.783211 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e2826a-d87e-4002-bece-b3eed4990771-kube-api-access-lz55l" (OuterVolumeSpecName: "kube-api-access-lz55l") pod "c5e2826a-d87e-4002-bece-b3eed4990771" (UID: "c5e2826a-d87e-4002-bece-b3eed4990771"). InnerVolumeSpecName "kube-api-access-lz55l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.789045 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-g4qfl"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.794564 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-scripts\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.820666 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng9kt\" (UniqueName: \"kubernetes.io/projected/af16052b-a5ab-4244-b007-69a32d050a35-kube-api-access-ng9kt\") pod \"cinder-db-sync-rm5bp\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " pod="openstack/cinder-db-sync-rm5bp"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.838898 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-run-httpd\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.838947 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-config\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.838968 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.838985 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839014 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-config-data\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839054 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-log-httpd\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839075 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqdnr\" (UniqueName: \"kubernetes.io/projected/24adc7ea-6cf7-416b-9570-a92736a9b48b-kube-api-access-zqdnr\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839095 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839113 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839129 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839149 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839170 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839187 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-scripts\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839208 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2b0c5b1-956d-4aae-97b8-849e47e03677-horizon-secret-key\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839224 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-config-data\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839240 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839256 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839274 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-db-sync-config-data\") pod \"barbican-db-sync-j7jqs\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " pod="openstack/barbican-db-sync-j7jqs"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839291 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-scripts\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839307 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2b0c5b1-956d-4aae-97b8-849e47e03677-logs\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839322 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839339 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839366 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-combined-ca-bundle\") pod \"barbican-db-sync-j7jqs\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " pod="openstack/barbican-db-sync-j7jqs"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839383 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839405 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839431 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrmm6\" (UniqueName: \"kubernetes.io/projected/16ac2620-c820-4881-b959-d12478e5f4ba-kube-api-access-wrmm6\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839446 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839461 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-logs\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839482 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839500 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhtff\" (UniqueName: \"kubernetes.io/projected/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-kube-api-access-qhtff\") pod \"barbican-db-sync-j7jqs\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " pod="openstack/barbican-db-sync-j7jqs"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839525 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-logs\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839542 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839563 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nhcj\" (UniqueName: \"kubernetes.io/projected/952e94ca-2ca0-4a73-af07-8a02572a080b-kube-api-access-2nhcj\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839584 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzsrp\" (UniqueName: \"kubernetes.io/projected/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-kube-api-access-xzsrp\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839614 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m44cp\" (UniqueName: \"kubernetes.io/projected/c2b0c5b1-956d-4aae-97b8-849e47e03677-kube-api-access-m44cp\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839634 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839655 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.839701 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz55l\" (UniqueName: \"kubernetes.io/projected/c5e2826a-d87e-4002-bece-b3eed4990771-kube-api-access-lz55l\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.840922 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0"
Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.843215 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName:
\"kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-logs\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.844303 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.844657 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.845763 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-config\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.846087 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-run-httpd\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.847009 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-log-httpd\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " 
pod="openstack/ceilometer-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.850927 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.854651 4748 scope.go:117] "RemoveContainer" containerID="e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.861623 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.863410 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.868043 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.871656 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rm5bp" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.919636 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-db-sync-config-data\") pod \"barbican-db-sync-j7jqs\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " pod="openstack/barbican-db-sync-j7jqs" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.928799 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.929453 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.929632 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.940981 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 
10:56:12.941066 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941115 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-logs\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941152 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nhcj\" (UniqueName: \"kubernetes.io/projected/952e94ca-2ca0-4a73-af07-8a02572a080b-kube-api-access-2nhcj\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941206 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m44cp\" (UniqueName: \"kubernetes.io/projected/c2b0c5b1-956d-4aae-97b8-849e47e03677-kube-api-access-m44cp\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941253 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-config-data\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941300 4748 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941319 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941337 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-scripts\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941358 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2b0c5b1-956d-4aae-97b8-849e47e03677-horizon-secret-key\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941390 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941405 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c2b0c5b1-956d-4aae-97b8-849e47e03677-logs\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941429 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.941773 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.948851 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-config-data\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.949432 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.951458 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.953610 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-scripts\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.954428 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.955058 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-combined-ca-bundle\") pod \"barbican-db-sync-j7jqs\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " pod="openstack/barbican-db-sync-j7jqs" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.957812 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.959144 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-config-data\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.965144 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2b0c5b1-956d-4aae-97b8-849e47e03677-logs\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:12 crc kubenswrapper[4748]: I0320 10:56:12.969090 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" podStartSLOduration=3.969055816 podStartE2EDuration="3.969055816s" podCreationTimestamp="2026-03-20 10:56:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:12.927979073 +0000 UTC m=+1208.069524897" watchObservedRunningTime="2026-03-20 10:56:12.969055816 +0000 UTC m=+1208.110601630" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.013711 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-scripts\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.017034 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2b0c5b1-956d-4aae-97b8-849e47e03677-horizon-secret-key\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.018298 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 
10:56:13.018397 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqdnr\" (UniqueName: \"kubernetes.io/projected/24adc7ea-6cf7-416b-9570-a92736a9b48b-kube-api-access-zqdnr\") pod \"dnsmasq-dns-785d8bcb8c-5g98d\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.018931 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.019119 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzsrp\" (UniqueName: \"kubernetes.io/projected/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-kube-api-access-xzsrp\") pod \"ceilometer-0\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " pod="openstack/ceilometer-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.021750 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.022406 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nhcj\" (UniqueName: \"kubernetes.io/projected/952e94ca-2ca0-4a73-af07-8a02572a080b-kube-api-access-2nhcj\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.029477 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.034220 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhtff\" (UniqueName: \"kubernetes.io/projected/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-kube-api-access-qhtff\") pod \"barbican-db-sync-j7jqs\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " pod="openstack/barbican-db-sync-j7jqs" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.035083 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrmm6\" (UniqueName: \"kubernetes.io/projected/16ac2620-c820-4881-b959-d12478e5f4ba-kube-api-access-wrmm6\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.038416 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m44cp\" (UniqueName: \"kubernetes.io/projected/c2b0c5b1-956d-4aae-97b8-849e47e03677-kube-api-access-m44cp\") pod \"horizon-69d8fb6c75-qkwp6\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.103644 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5e2826a-d87e-4002-bece-b3eed4990771" (UID: "c5e2826a-d87e-4002-bece-b3eed4990771"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.103824 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.153787 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5e2826a-d87e-4002-bece-b3eed4990771" (UID: "c5e2826a-d87e-4002-bece-b3eed4990771"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.156180 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.156210 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.185256 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.189916 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-dns-svc" (OuterVolumeSpecName: "dns-svc") pod 
"c5e2826a-d87e-4002-bece-b3eed4990771" (UID: "c5e2826a-d87e-4002-bece-b3eed4990771"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.219747 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-config" (OuterVolumeSpecName: "config") pod "c5e2826a-d87e-4002-bece-b3eed4990771" (UID: "c5e2826a-d87e-4002-bece-b3eed4990771"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.236801 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.251728 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-j7jqs" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.257920 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.257952 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e2826a-d87e-4002-bece-b3eed4990771-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.267695 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.316353 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.329051 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.351245 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.399147 4748 scope.go:117] "RemoveContainer" containerID="0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b" Mar 20 10:56:13 crc kubenswrapper[4748]: E0320 10:56:13.400197 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b\": container with ID starting with 0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b not found: ID does not exist" containerID="0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.400257 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b"} err="failed to get container status \"0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b\": rpc error: code = NotFound desc = could not find container \"0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b\": container with ID starting with 0e436915fb6c6bf3dc203d98ac633ed1fd433ebcd13e6da71041d12969b3f57b not found: ID does not exist" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.400296 4748 scope.go:117] "RemoveContainer" containerID="e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599" Mar 20 10:56:13 crc kubenswrapper[4748]: E0320 10:56:13.400829 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599\": container with ID starting with 
e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599 not found: ID does not exist" containerID="e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.400970 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599"} err="failed to get container status \"e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599\": rpc error: code = NotFound desc = could not find container \"e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599\": container with ID starting with e197470b807c4b52cd627816c1acbe9e1159bd716fce0a771da472c164383599 not found: ID does not exist" Mar 20 10:56:13 crc kubenswrapper[4748]: E0320 10:56:13.404518 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod543b35d5_7087_4a09_bbfb_6937b3765855.slice/crio-conmon-cc3d10b384b48095b3ea900b6f455aeed1fe457a88698c25a7111902fa133019.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod543b35d5_7087_4a09_bbfb_6937b3765855.slice/crio-cc3d10b384b48095b3ea900b6f455aeed1fe457a88698c25a7111902fa133019.scope\": RecentStats: unable to find data in memory cache]" Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.611816 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-67n95"] Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.627765 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-67n95"] Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.655524 4748 generic.go:334] "Generic (PLEG): container finished" podID="543b35d5-7087-4a09-bbfb-6937b3765855" 
containerID="cc3d10b384b48095b3ea900b6f455aeed1fe457a88698c25a7111902fa133019" exitCode=0 Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.655588 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" event={"ID":"543b35d5-7087-4a09-bbfb-6937b3765855","Type":"ContainerDied","Data":"cc3d10b384b48095b3ea900b6f455aeed1fe457a88698c25a7111902fa133019"} Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.779069 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8t679"] Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.921420 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-g4qfl"] Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.955743 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-r6t2v"] Mar 20 10:56:13 crc kubenswrapper[4748]: I0320 10:56:13.980956 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-42s2j"] Mar 20 10:56:13 crc kubenswrapper[4748]: W0320 10:56:13.995914 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcd1bb77_acf9_4f10_855d_4ab63f3ad229.slice/crio-304286bbf5d469aeee81c1d9e8ee78d81797054684e4cd5e340c9b01ab1be4d0 WatchSource:0}: Error finding container 304286bbf5d469aeee81c1d9e8ee78d81797054684e4cd5e340c9b01ab1be4d0: Status 404 returned error can't find the container with id 304286bbf5d469aeee81c1d9e8ee78d81797054684e4cd5e340c9b01ab1be4d0 Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.140807 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.189803 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g8tb\" (UniqueName: \"kubernetes.io/projected/543b35d5-7087-4a09-bbfb-6937b3765855-kube-api-access-4g8tb\") pod \"543b35d5-7087-4a09-bbfb-6937b3765855\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.189938 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-svc\") pod \"543b35d5-7087-4a09-bbfb-6937b3765855\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.190178 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-sb\") pod \"543b35d5-7087-4a09-bbfb-6937b3765855\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.190211 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-config\") pod \"543b35d5-7087-4a09-bbfb-6937b3765855\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.190242 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-nb\") pod \"543b35d5-7087-4a09-bbfb-6937b3765855\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.190371 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-swift-storage-0\") pod \"543b35d5-7087-4a09-bbfb-6937b3765855\" (UID: \"543b35d5-7087-4a09-bbfb-6937b3765855\") " Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.237152 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543b35d5-7087-4a09-bbfb-6937b3765855-kube-api-access-4g8tb" (OuterVolumeSpecName: "kube-api-access-4g8tb") pod "543b35d5-7087-4a09-bbfb-6937b3765855" (UID: "543b35d5-7087-4a09-bbfb-6937b3765855"). InnerVolumeSpecName "kube-api-access-4g8tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.294346 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g8tb\" (UniqueName: \"kubernetes.io/projected/543b35d5-7087-4a09-bbfb-6937b3765855-kube-api-access-4g8tb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.421220 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.468408 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69d8fb6c75-qkwp6"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.527273 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-j7jqs"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.556747 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:56:14 crc kubenswrapper[4748]: W0320 10:56:14.557784 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4a4f230_3fe6_44a4_a91b_5b0ea07ae755.slice/crio-c14c86cf03f9fe0ca3c0e2c665e5093475365ee25da6bfb5d95ce3271c77eacb WatchSource:0}: Error finding container c14c86cf03f9fe0ca3c0e2c665e5093475365ee25da6bfb5d95ce3271c77eacb: Status 404 
returned error can't find the container with id c14c86cf03f9fe0ca3c0e2c665e5093475365ee25da6bfb5d95ce3271c77eacb Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.590863 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cfd8f8d7c-h5mnf"] Mar 20 10:56:14 crc kubenswrapper[4748]: E0320 10:56:14.591415 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543b35d5-7087-4a09-bbfb-6937b3765855" containerName="dnsmasq-dns" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.591440 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="543b35d5-7087-4a09-bbfb-6937b3765855" containerName="dnsmasq-dns" Mar 20 10:56:14 crc kubenswrapper[4748]: E0320 10:56:14.591466 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543b35d5-7087-4a09-bbfb-6937b3765855" containerName="init" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.591476 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="543b35d5-7087-4a09-bbfb-6937b3765855" containerName="init" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.591687 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="543b35d5-7087-4a09-bbfb-6937b3765855" containerName="dnsmasq-dns" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.609265 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.637301 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b8d4ccdcc-grvq4"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.685132 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rm5bp"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.686372 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "543b35d5-7087-4a09-bbfb-6937b3765855" (UID: "543b35d5-7087-4a09-bbfb-6937b3765855"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.687287 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "543b35d5-7087-4a09-bbfb-6937b3765855" (UID: "543b35d5-7087-4a09-bbfb-6937b3765855"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.731591 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5g98d"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.735411 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.735442 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.775487 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cfd8f8d7c-h5mnf"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.820345 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-r6t2v" event={"ID":"e7f8f96f-de61-435d-a542-0b10d8860ccd","Type":"ContainerStarted","Data":"3bc0db86ee4b53156414e6bc24baf2c78fbeca995b3c9e9e4d945b908cfcb834"} Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.820404 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-r6t2v" event={"ID":"e7f8f96f-de61-435d-a542-0b10d8860ccd","Type":"ContainerStarted","Data":"4f8fae9e624f573d3977105bbfe89b6f89ea2cd1946529b914861b11604a7d51"} Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.823604 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-config" (OuterVolumeSpecName: "config") pod "543b35d5-7087-4a09-bbfb-6937b3765855" (UID: "543b35d5-7087-4a09-bbfb-6937b3765855"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.829701 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-j7jqs" event={"ID":"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755","Type":"ContainerStarted","Data":"c14c86cf03f9fe0ca3c0e2c665e5093475365ee25da6bfb5d95ce3271c77eacb"} Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.841674 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "543b35d5-7087-4a09-bbfb-6937b3765855" (UID: "543b35d5-7087-4a09-bbfb-6937b3765855"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.847467 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42s2j" event={"ID":"fcd1bb77-acf9-4f10-855d-4ab63f3ad229","Type":"ContainerStarted","Data":"be3695bbb6fdf796a499eeaa190fe1108f365b77b2d5a6962cc53640abb5b2de"} Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.847527 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42s2j" event={"ID":"fcd1bb77-acf9-4f10-855d-4ab63f3ad229","Type":"ContainerStarted","Data":"304286bbf5d469aeee81c1d9e8ee78d81797054684e4cd5e340c9b01ab1be4d0"} Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.849762 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-g4qfl" event={"ID":"6340e84c-8ffb-40e5-a470-52a50bff86f1","Type":"ContainerStarted","Data":"1c01d9fe33d0b6c8604dbe356b18748c2c2ba07b2ee401d34a3e154f5a2026cb"} Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.867510 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.877172 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa097fcb-4ad5-4fda-a410-a64ad13495d6-logs\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.877254 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa097fcb-4ad5-4fda-a410-a64ad13495d6-horizon-secret-key\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.893075 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z4qt\" (UniqueName: \"kubernetes.io/projected/aa097fcb-4ad5-4fda-a410-a64ad13495d6-kube-api-access-4z4qt\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.893246 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-config-data\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.893400 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-scripts\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.893683 4748 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.893717 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.899879 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rm5bp" event={"ID":"af16052b-a5ab-4244-b007-69a32d050a35","Type":"ContainerStarted","Data":"60e807de3887c03a9534eba93d8d22e8db9862960b97894d8ad6ed0aa49676a3"} Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.916333 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69d8fb6c75-qkwp6"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.923661 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "543b35d5-7087-4a09-bbfb-6937b3765855" (UID: "543b35d5-7087-4a09-bbfb-6937b3765855"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.935226 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.950451 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" event={"ID":"543b35d5-7087-4a09-bbfb-6937b3765855","Type":"ContainerDied","Data":"c47f490911807806df5ff33fc4949a3ab383925901b1513709f35bd46ed90c91"} Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.950520 4748 scope.go:117] "RemoveContainer" containerID="cc3d10b384b48095b3ea900b6f455aeed1fe457a88698c25a7111902fa133019" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.950663 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-qpm57" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.956968 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-8t679" event={"ID":"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae","Type":"ContainerStarted","Data":"37915ac9c1fdca7a86f9251c4c10d4bc4789d14421f5965dd155f180a19e4c85"} Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.961023 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.990690 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4a84b3e-b01b-424e-969b-e2cbc625f3f3","Type":"ContainerStarted","Data":"20a9ce4214b791e72549d5217073d33447106b29028f5240e3d8862aeb6db1ee"} Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.995573 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-scripts\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " 
pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.995752 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa097fcb-4ad5-4fda-a410-a64ad13495d6-logs\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.995788 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa097fcb-4ad5-4fda-a410-a64ad13495d6-horizon-secret-key\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.995876 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z4qt\" (UniqueName: \"kubernetes.io/projected/aa097fcb-4ad5-4fda-a410-a64ad13495d6-kube-api-access-4z4qt\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.995914 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-config-data\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.996075 4748 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/543b35d5-7087-4a09-bbfb-6937b3765855-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.996779 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-scripts\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.997448 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-config-data\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:14 crc kubenswrapper[4748]: I0320 10:56:14.998682 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa097fcb-4ad5-4fda-a410-a64ad13495d6-logs\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.012923 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-r6t2v" podStartSLOduration=3.01289215 podStartE2EDuration="3.01289215s" podCreationTimestamp="2026-03-20 10:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:14.884305227 +0000 UTC m=+1210.025851051" watchObservedRunningTime="2026-03-20 10:56:15.01289215 +0000 UTC m=+1210.154438004" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.015655 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa097fcb-4ad5-4fda-a410-a64ad13495d6-horizon-secret-key\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.021509 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4z4qt\" (UniqueName: \"kubernetes.io/projected/aa097fcb-4ad5-4fda-a410-a64ad13495d6-kube-api-access-4z4qt\") pod \"horizon-5cfd8f8d7c-h5mnf\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.032632 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.044707 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-42s2j" podStartSLOduration=4.044681989 podStartE2EDuration="4.044681989s" podCreationTimestamp="2026-03-20 10:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:14.925129953 +0000 UTC m=+1210.066675767" watchObservedRunningTime="2026-03-20 10:56:15.044681989 +0000 UTC m=+1210.186227803" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.069109 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qpm57"] Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.069780 4748 scope.go:117] "RemoveContainer" containerID="4f534fdbc40d88a75c92508b0831ab6db7bba17dfabbce4a88fd5ea0bf9fea07" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.081342 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qpm57"] Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.282417 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.338707 4748 scope.go:117] "RemoveContainer" containerID="f7863050dbe48cee1b8d6c88fffaf895e820d9a23f3721578114ae080eaaeb81" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.458710 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.546986 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-sb\") pod \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.547097 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-svc\") pod \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.547235 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-config\") pod \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.547282 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-nb\") pod \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.547319 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qfm8\" (UniqueName: \"kubernetes.io/projected/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-kube-api-access-7qfm8\") pod \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.547357 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-swift-storage-0\") pod \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\" (UID: \"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae\") " Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.594701 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-kube-api-access-7qfm8" (OuterVolumeSpecName: "kube-api-access-7qfm8") pod "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" (UID: "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae"). InnerVolumeSpecName "kube-api-access-7qfm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.598748 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543b35d5-7087-4a09-bbfb-6937b3765855" path="/var/lib/kubelet/pods/543b35d5-7087-4a09-bbfb-6937b3765855/volumes" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.600517 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e2826a-d87e-4002-bece-b3eed4990771" path="/var/lib/kubelet/pods/c5e2826a-d87e-4002-bece-b3eed4990771/volumes" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.605744 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" (UID: "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.613934 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-config" (OuterVolumeSpecName: "config") pod "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" (UID: "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.618890 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" (UID: "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.620364 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" (UID: "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.653061 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.653091 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.653100 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.653111 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qfm8\" (UniqueName: \"kubernetes.io/projected/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-kube-api-access-7qfm8\") on node \"crc\" DevicePath \"\"" Mar 20 
10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.653120 4748 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.702151 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" (UID: "24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.755146 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:15 crc kubenswrapper[4748]: I0320 10:56:15.846484 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cfd8f8d7c-h5mnf"] Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.040889 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8d4ccdcc-grvq4" event={"ID":"dddc05be-fe06-4c29-8102-d6118b6f36d5","Type":"ContainerStarted","Data":"98518173afb007a8f4b226333d979551b3c2b7f486663828b15c2283d3e413e3"} Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.047105 4748 generic.go:334] "Generic (PLEG): container finished" podID="24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" containerID="fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f" exitCode=0 Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.047167 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-8t679" Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.047200 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-8t679" event={"ID":"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae","Type":"ContainerDied","Data":"37915ac9c1fdca7a86f9251c4c10d4bc4789d14421f5965dd155f180a19e4c85"} Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.047257 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-8t679" event={"ID":"24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae","Type":"ContainerDied","Data":"fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f"} Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.047279 4748 scope.go:117] "RemoveContainer" containerID="fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f" Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.055075 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16ac2620-c820-4881-b959-d12478e5f4ba","Type":"ContainerStarted","Data":"22430d0042f0ed5cb007e28f6d3ccf1b7b2fb2775365375a469d0c0175dfe3fc"} Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.064405 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d8fb6c75-qkwp6" event={"ID":"c2b0c5b1-956d-4aae-97b8-849e47e03677","Type":"ContainerStarted","Data":"1ea4775183436c3f3ec39c31ec78b2d9a1c1d0a3f06d49e0063c67fc2c4bd9a7"} Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.074069 4748 generic.go:334] "Generic (PLEG): container finished" podID="24adc7ea-6cf7-416b-9570-a92736a9b48b" containerID="332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce" exitCode=0 Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.074138 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" 
event={"ID":"24adc7ea-6cf7-416b-9570-a92736a9b48b","Type":"ContainerDied","Data":"332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce"} Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.074165 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" event={"ID":"24adc7ea-6cf7-416b-9570-a92736a9b48b","Type":"ContainerStarted","Data":"1d39e39b445ed448b1df60fa8583d35fdcd62be2730f0ed34b82aa517a848ffc"} Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.087082 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cfd8f8d7c-h5mnf" event={"ID":"aa097fcb-4ad5-4fda-a410-a64ad13495d6","Type":"ContainerStarted","Data":"26907ff68937d83f28062eee7b1abfc36dea79a78efe0533d8be2eaa5e12cdf2"} Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.090886 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"952e94ca-2ca0-4a73-af07-8a02572a080b","Type":"ContainerStarted","Data":"03e83f620f71fe7d7a163a3a9827c784f99b1ddd38b713d0697b78ab72e15268"} Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.107674 4748 scope.go:117] "RemoveContainer" containerID="fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f" Mar 20 10:56:16 crc kubenswrapper[4748]: E0320 10:56:16.109738 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f\": container with ID starting with fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f not found: ID does not exist" containerID="fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f" Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.109783 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f"} err="failed to get container 
status \"fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f\": rpc error: code = NotFound desc = could not find container \"fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f\": container with ID starting with fc595114864ebdc9f9fb3d1dc301db1e3dcc037e26048e1f7e1790996ca8fd7f not found: ID does not exist" Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.330476 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8t679"] Mar 20 10:56:16 crc kubenswrapper[4748]: I0320 10:56:16.349414 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-8t679"] Mar 20 10:56:17 crc kubenswrapper[4748]: I0320 10:56:17.159352 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"952e94ca-2ca0-4a73-af07-8a02572a080b","Type":"ContainerStarted","Data":"2f526096f41e83b934123a3a9b75ae5487b966cfc04857fb23ef0bf6f46d38ea"} Mar 20 10:56:17 crc kubenswrapper[4748]: I0320 10:56:17.161918 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16ac2620-c820-4881-b959-d12478e5f4ba","Type":"ContainerStarted","Data":"348df6b5d3755f7725ea911eb2e14e2469d79eaae2e2a58deb61b534849f3b14"} Mar 20 10:56:17 crc kubenswrapper[4748]: I0320 10:56:17.165261 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" event={"ID":"24adc7ea-6cf7-416b-9570-a92736a9b48b","Type":"ContainerStarted","Data":"82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136"} Mar 20 10:56:17 crc kubenswrapper[4748]: I0320 10:56:17.165350 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:56:17 crc kubenswrapper[4748]: I0320 10:56:17.535724 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" 
path="/var/lib/kubelet/pods/24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae/volumes" Mar 20 10:56:18 crc kubenswrapper[4748]: I0320 10:56:18.218496 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16ac2620-c820-4881-b959-d12478e5f4ba","Type":"ContainerStarted","Data":"b179e670570fb05fdb090dd6f9b5d81690c8690e94d33b7fa7f91628a328b55c"} Mar 20 10:56:18 crc kubenswrapper[4748]: I0320 10:56:18.219251 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="16ac2620-c820-4881-b959-d12478e5f4ba" containerName="glance-log" containerID="cri-o://348df6b5d3755f7725ea911eb2e14e2469d79eaae2e2a58deb61b534849f3b14" gracePeriod=30 Mar 20 10:56:18 crc kubenswrapper[4748]: I0320 10:56:18.219804 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="16ac2620-c820-4881-b959-d12478e5f4ba" containerName="glance-httpd" containerID="cri-o://b179e670570fb05fdb090dd6f9b5d81690c8690e94d33b7fa7f91628a328b55c" gracePeriod=30 Mar 20 10:56:18 crc kubenswrapper[4748]: I0320 10:56:18.231087 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"952e94ca-2ca0-4a73-af07-8a02572a080b","Type":"ContainerStarted","Data":"701bbd352e749b7cf6f1466360b7be0f1095943e20208c22277d4d7de3eaac02"} Mar 20 10:56:18 crc kubenswrapper[4748]: I0320 10:56:18.231312 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="952e94ca-2ca0-4a73-af07-8a02572a080b" containerName="glance-log" containerID="cri-o://2f526096f41e83b934123a3a9b75ae5487b966cfc04857fb23ef0bf6f46d38ea" gracePeriod=30 Mar 20 10:56:18 crc kubenswrapper[4748]: I0320 10:56:18.231321 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="952e94ca-2ca0-4a73-af07-8a02572a080b" containerName="glance-httpd" containerID="cri-o://701bbd352e749b7cf6f1466360b7be0f1095943e20208c22277d4d7de3eaac02" gracePeriod=30 Mar 20 10:56:18 crc kubenswrapper[4748]: I0320 10:56:18.265604 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" podStartSLOduration=6.265581326 podStartE2EDuration="6.265581326s" podCreationTimestamp="2026-03-20 10:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:17.189561294 +0000 UTC m=+1212.331107138" watchObservedRunningTime="2026-03-20 10:56:18.265581326 +0000 UTC m=+1213.407127140" Mar 20 10:56:18 crc kubenswrapper[4748]: I0320 10:56:18.269128 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.269108415 podStartE2EDuration="6.269108415s" podCreationTimestamp="2026-03-20 10:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:18.261991316 +0000 UTC m=+1213.403537130" watchObservedRunningTime="2026-03-20 10:56:18.269108415 +0000 UTC m=+1213.410654219" Mar 20 10:56:19 crc kubenswrapper[4748]: I0320 10:56:19.252033 4748 generic.go:334] "Generic (PLEG): container finished" podID="16ac2620-c820-4881-b959-d12478e5f4ba" containerID="b179e670570fb05fdb090dd6f9b5d81690c8690e94d33b7fa7f91628a328b55c" exitCode=0 Mar 20 10:56:19 crc kubenswrapper[4748]: I0320 10:56:19.252433 4748 generic.go:334] "Generic (PLEG): container finished" podID="16ac2620-c820-4881-b959-d12478e5f4ba" containerID="348df6b5d3755f7725ea911eb2e14e2469d79eaae2e2a58deb61b534849f3b14" exitCode=143 Mar 20 10:56:19 crc kubenswrapper[4748]: I0320 10:56:19.252476 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"16ac2620-c820-4881-b959-d12478e5f4ba","Type":"ContainerDied","Data":"b179e670570fb05fdb090dd6f9b5d81690c8690e94d33b7fa7f91628a328b55c"} Mar 20 10:56:19 crc kubenswrapper[4748]: I0320 10:56:19.252530 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16ac2620-c820-4881-b959-d12478e5f4ba","Type":"ContainerDied","Data":"348df6b5d3755f7725ea911eb2e14e2469d79eaae2e2a58deb61b534849f3b14"} Mar 20 10:56:19 crc kubenswrapper[4748]: I0320 10:56:19.258701 4748 generic.go:334] "Generic (PLEG): container finished" podID="952e94ca-2ca0-4a73-af07-8a02572a080b" containerID="701bbd352e749b7cf6f1466360b7be0f1095943e20208c22277d4d7de3eaac02" exitCode=0 Mar 20 10:56:19 crc kubenswrapper[4748]: I0320 10:56:19.258724 4748 generic.go:334] "Generic (PLEG): container finished" podID="952e94ca-2ca0-4a73-af07-8a02572a080b" containerID="2f526096f41e83b934123a3a9b75ae5487b966cfc04857fb23ef0bf6f46d38ea" exitCode=143 Mar 20 10:56:19 crc kubenswrapper[4748]: I0320 10:56:19.258762 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"952e94ca-2ca0-4a73-af07-8a02572a080b","Type":"ContainerDied","Data":"701bbd352e749b7cf6f1466360b7be0f1095943e20208c22277d4d7de3eaac02"} Mar 20 10:56:19 crc kubenswrapper[4748]: I0320 10:56:19.258785 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"952e94ca-2ca0-4a73-af07-8a02572a080b","Type":"ContainerDied","Data":"2f526096f41e83b934123a3a9b75ae5487b966cfc04857fb23ef0bf6f46d38ea"} Mar 20 10:56:20 crc kubenswrapper[4748]: I0320 10:56:20.274024 4748 generic.go:334] "Generic (PLEG): container finished" podID="fcd1bb77-acf9-4f10-855d-4ab63f3ad229" containerID="be3695bbb6fdf796a499eeaa190fe1108f365b77b2d5a6962cc53640abb5b2de" exitCode=0 Mar 20 10:56:20 crc kubenswrapper[4748]: I0320 10:56:20.274076 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-42s2j" event={"ID":"fcd1bb77-acf9-4f10-855d-4ab63f3ad229","Type":"ContainerDied","Data":"be3695bbb6fdf796a499eeaa190fe1108f365b77b2d5a6962cc53640abb5b2de"} Mar 20 10:56:20 crc kubenswrapper[4748]: I0320 10:56:20.309235 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.309185145 podStartE2EDuration="8.309185145s" podCreationTimestamp="2026-03-20 10:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:18.298213087 +0000 UTC m=+1213.439758901" watchObservedRunningTime="2026-03-20 10:56:20.309185145 +0000 UTC m=+1215.450730959" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.110247 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b8d4ccdcc-grvq4"] Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.142733 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b85b9d5c6-7bgxl"] Mar 20 10:56:21 crc kubenswrapper[4748]: E0320 10:56:21.143095 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" containerName="init" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.143115 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" containerName="init" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.143391 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="24d482f6-45ad-43d4-b2a8-3bbcf4d3f8ae" containerName="init" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.144411 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.149122 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.157930 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b85b9d5c6-7bgxl"] Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.229348 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cfd8f8d7c-h5mnf"] Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.261491 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d79b6bb86-nhfts"] Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.262986 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.270427 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d79b6bb86-nhfts"] Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.288709 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-secret-key\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.288760 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e8975c-a2d8-4319-b175-2a66ce3d97c9-logs\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.288877 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-combined-ca-bundle\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.288921 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-config-data\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.288958 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-scripts\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.288984 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ds9t\" (UniqueName: \"kubernetes.io/projected/18e8975c-a2d8-4319-b175-2a66ce3d97c9-kube-api-access-2ds9t\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.289056 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-tls-certs\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.393493 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-secret-key\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.393580 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e8975c-a2d8-4319-b175-2a66ce3d97c9-logs\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.393623 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3de236a-e527-4582-8eb5-03ca8aa883e0-horizon-tls-certs\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.393691 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3de236a-e527-4582-8eb5-03ca8aa883e0-logs\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.393744 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3de236a-e527-4582-8eb5-03ca8aa883e0-horizon-secret-key\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.393777 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-combined-ca-bundle\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.393853 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-config-data\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.393894 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-scripts\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.393921 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ds9t\" (UniqueName: \"kubernetes.io/projected/18e8975c-a2d8-4319-b175-2a66ce3d97c9-kube-api-access-2ds9t\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.394010 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54fjv\" (UniqueName: \"kubernetes.io/projected/f3de236a-e527-4582-8eb5-03ca8aa883e0-kube-api-access-54fjv\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.394079 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f3de236a-e527-4582-8eb5-03ca8aa883e0-scripts\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.394105 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-tls-certs\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.394152 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3de236a-e527-4582-8eb5-03ca8aa883e0-config-data\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.394183 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3de236a-e527-4582-8eb5-03ca8aa883e0-combined-ca-bundle\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.394222 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e8975c-a2d8-4319-b175-2a66ce3d97c9-logs\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.394814 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-scripts\") pod \"horizon-b85b9d5c6-7bgxl\" 
(UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.398422 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-config-data\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.401437 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-combined-ca-bundle\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.427271 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-tls-certs\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.436285 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-secret-key\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.457006 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ds9t\" (UniqueName: \"kubernetes.io/projected/18e8975c-a2d8-4319-b175-2a66ce3d97c9-kube-api-access-2ds9t\") pod \"horizon-b85b9d5c6-7bgxl\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc 
kubenswrapper[4748]: I0320 10:56:21.476378 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.495392 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54fjv\" (UniqueName: \"kubernetes.io/projected/f3de236a-e527-4582-8eb5-03ca8aa883e0-kube-api-access-54fjv\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.495501 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3de236a-e527-4582-8eb5-03ca8aa883e0-scripts\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.495539 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3de236a-e527-4582-8eb5-03ca8aa883e0-config-data\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.495560 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3de236a-e527-4582-8eb5-03ca8aa883e0-combined-ca-bundle\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.495608 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3de236a-e527-4582-8eb5-03ca8aa883e0-horizon-tls-certs\") pod \"horizon-7d79b6bb86-nhfts\" (UID: 
\"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.495656 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3de236a-e527-4582-8eb5-03ca8aa883e0-logs\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.495686 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3de236a-e527-4582-8eb5-03ca8aa883e0-horizon-secret-key\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.498283 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3de236a-e527-4582-8eb5-03ca8aa883e0-scripts\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.499353 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3de236a-e527-4582-8eb5-03ca8aa883e0-horizon-secret-key\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.499603 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3de236a-e527-4582-8eb5-03ca8aa883e0-config-data\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.499820 
4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3de236a-e527-4582-8eb5-03ca8aa883e0-logs\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.502449 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3de236a-e527-4582-8eb5-03ca8aa883e0-horizon-tls-certs\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.502785 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3de236a-e527-4582-8eb5-03ca8aa883e0-combined-ca-bundle\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.525384 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54fjv\" (UniqueName: \"kubernetes.io/projected/f3de236a-e527-4582-8eb5-03ca8aa883e0-kube-api-access-54fjv\") pod \"horizon-7d79b6bb86-nhfts\" (UID: \"f3de236a-e527-4582-8eb5-03ca8aa883e0\") " pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:21 crc kubenswrapper[4748]: I0320 10:56:21.587339 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:23 crc kubenswrapper[4748]: I0320 10:56:23.318728 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:56:23 crc kubenswrapper[4748]: I0320 10:56:23.389381 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jb9kr"] Mar 20 10:56:23 crc kubenswrapper[4748]: I0320 10:56:23.390019 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-jb9kr" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerName="dnsmasq-dns" containerID="cri-o://8790c054f45fcf832b100fb8aef752a10849709a0806741a7a4087d053414ea1" gracePeriod=10 Mar 20 10:56:23 crc kubenswrapper[4748]: E0320 10:56:23.647729 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe43a4aa_ef4d_4509_8760_b98a4de5b2e5.slice/crio-conmon-8790c054f45fcf832b100fb8aef752a10849709a0806741a7a4087d053414ea1.scope\": RecentStats: unable to find data in memory cache]" Mar 20 10:56:24 crc kubenswrapper[4748]: I0320 10:56:24.337333 4748 generic.go:334] "Generic (PLEG): container finished" podID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerID="8790c054f45fcf832b100fb8aef752a10849709a0806741a7a4087d053414ea1" exitCode=0 Mar 20 10:56:24 crc kubenswrapper[4748]: I0320 10:56:24.337386 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jb9kr" event={"ID":"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5","Type":"ContainerDied","Data":"8790c054f45fcf832b100fb8aef752a10849709a0806741a7a4087d053414ea1"} Mar 20 10:56:25 crc kubenswrapper[4748]: I0320 10:56:25.255797 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jb9kr" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused"
Mar 20 10:56:27 crc kubenswrapper[4748]: I0320 10:56:27.967585 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42s2j"
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.031637 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-fernet-keys\") pod \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") "
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.031699 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2n28\" (UniqueName: \"kubernetes.io/projected/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-kube-api-access-v2n28\") pod \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") "
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.031782 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-combined-ca-bundle\") pod \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") "
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.031823 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-config-data\") pod \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") "
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.031886 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-credential-keys\") pod \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") "
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.031918 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-scripts\") pod \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\" (UID: \"fcd1bb77-acf9-4f10-855d-4ab63f3ad229\") "
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.039863 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fcd1bb77-acf9-4f10-855d-4ab63f3ad229" (UID: "fcd1bb77-acf9-4f10-855d-4ab63f3ad229"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.041625 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-kube-api-access-v2n28" (OuterVolumeSpecName: "kube-api-access-v2n28") pod "fcd1bb77-acf9-4f10-855d-4ab63f3ad229" (UID: "fcd1bb77-acf9-4f10-855d-4ab63f3ad229"). InnerVolumeSpecName "kube-api-access-v2n28". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.049602 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fcd1bb77-acf9-4f10-855d-4ab63f3ad229" (UID: "fcd1bb77-acf9-4f10-855d-4ab63f3ad229"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.056997 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-scripts" (OuterVolumeSpecName: "scripts") pod "fcd1bb77-acf9-4f10-855d-4ab63f3ad229" (UID: "fcd1bb77-acf9-4f10-855d-4ab63f3ad229"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.077438 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcd1bb77-acf9-4f10-855d-4ab63f3ad229" (UID: "fcd1bb77-acf9-4f10-855d-4ab63f3ad229"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.080312 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-config-data" (OuterVolumeSpecName: "config-data") pod "fcd1bb77-acf9-4f10-855d-4ab63f3ad229" (UID: "fcd1bb77-acf9-4f10-855d-4ab63f3ad229"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.133815 4748 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.133882 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2n28\" (UniqueName: \"kubernetes.io/projected/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-kube-api-access-v2n28\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.133901 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.133915 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.133927 4748 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.133939 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcd1bb77-acf9-4f10-855d-4ab63f3ad229-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.372383 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42s2j" event={"ID":"fcd1bb77-acf9-4f10-855d-4ab63f3ad229","Type":"ContainerDied","Data":"304286bbf5d469aeee81c1d9e8ee78d81797054684e4cd5e340c9b01ab1be4d0"}
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.372435 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="304286bbf5d469aeee81c1d9e8ee78d81797054684e4cd5e340c9b01ab1be4d0"
Mar 20 10:56:28 crc kubenswrapper[4748]: I0320 10:56:28.372557 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42s2j"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.058644 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-42s2j"]
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.066514 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-42s2j"]
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.151586 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-744pk"]
Mar 20 10:56:29 crc kubenswrapper[4748]: E0320 10:56:29.151974 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd1bb77-acf9-4f10-855d-4ab63f3ad229" containerName="keystone-bootstrap"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.151991 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd1bb77-acf9-4f10-855d-4ab63f3ad229" containerName="keystone-bootstrap"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.152188 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd1bb77-acf9-4f10-855d-4ab63f3ad229" containerName="keystone-bootstrap"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.152718 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.156147 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.156200 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.156661 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.161230 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.166589 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n27fv"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.185153 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-744pk"]
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.265922 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-credential-keys\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.265994 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-config-data\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.266034 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-scripts\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.266312 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-combined-ca-bundle\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.266404 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-fernet-keys\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.266547 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k26wc\" (UniqueName: \"kubernetes.io/projected/66aa7b6f-a021-4161-b3da-ddb593f2b169-kube-api-access-k26wc\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.367864 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-combined-ca-bundle\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.367935 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-fernet-keys\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.368002 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k26wc\" (UniqueName: \"kubernetes.io/projected/66aa7b6f-a021-4161-b3da-ddb593f2b169-kube-api-access-k26wc\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.368075 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-credential-keys\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.368120 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-config-data\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.368159 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-scripts\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.373826 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-credential-keys\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.377524 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-scripts\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.377770 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-fernet-keys\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.378014 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-config-data\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.378604 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-combined-ca-bundle\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.386739 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k26wc\" (UniqueName: \"kubernetes.io/projected/66aa7b6f-a021-4161-b3da-ddb593f2b169-kube-api-access-k26wc\") pod \"keystone-bootstrap-744pk\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: E0320 10:56:29.451989 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified"
Mar 20 10:56:29 crc kubenswrapper[4748]: E0320 10:56:29.452371 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jbt7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-g4qfl_openstack(6340e84c-8ffb-40e5-a470-52a50bff86f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 10:56:29 crc kubenswrapper[4748]: E0320 10:56:29.453649 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-g4qfl" podUID="6340e84c-8ffb-40e5-a470-52a50bff86f1"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.488884 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-744pk"
Mar 20 10:56:29 crc kubenswrapper[4748]: I0320 10:56:29.528165 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd1bb77-acf9-4f10-855d-4ab63f3ad229" path="/var/lib/kubelet/pods/fcd1bb77-acf9-4f10-855d-4ab63f3ad229/volumes"
Mar 20 10:56:30 crc kubenswrapper[4748]: I0320 10:56:30.258497 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jb9kr" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused"
Mar 20 10:56:30 crc kubenswrapper[4748]: E0320 10:56:30.397908 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-g4qfl" podUID="6340e84c-8ffb-40e5-a470-52a50bff86f1"
Mar 20 10:56:34 crc kubenswrapper[4748]: E0320 10:56:34.152208 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified"
Mar 20 10:56:34 crc kubenswrapper[4748]: E0320 10:56:34.153404 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68bh585h55h56fh66ch58ch69h56fhbhd8hc9h66hdbh5c6h64h66fh659h74hc4h59hc7h58ch546h587hb8h5bh569h66fh694hb8h697h5b5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jxb4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-b8d4ccdcc-grvq4_openstack(dddc05be-fe06-4c29-8102-d6118b6f36d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 10:56:34 crc kubenswrapper[4748]: E0320 10:56:34.156094 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-b8d4ccdcc-grvq4" podUID="dddc05be-fe06-4c29-8102-d6118b6f36d5"
Mar 20 10:56:38 crc kubenswrapper[4748]: E0320 10:56:38.474448 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified"
Mar 20 10:56:38 crc kubenswrapper[4748]: E0320 10:56:38.475125 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cch6dh5c6hb6h8dh95h64dh64ch647h56fh587h596h695hbh57h5dfh55bh669h5bhf9h567h569h5dh545h55fh65h4h579hddh666h658hd8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m44cp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-69d8fb6c75-qkwp6_openstack(c2b0c5b1-956d-4aae-97b8-849e47e03677): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 10:56:38 crc kubenswrapper[4748]: E0320 10:56:38.477430 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-69d8fb6c75-qkwp6" podUID="c2b0c5b1-956d-4aae-97b8-849e47e03677"
Mar 20 10:56:40 crc kubenswrapper[4748]: I0320 10:56:40.256662 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jb9kr" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout"
Mar 20 10:56:40 crc kubenswrapper[4748]: I0320 10:56:40.257498 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jb9kr"
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.397855 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.411249 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.433788 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"16ac2620-c820-4881-b959-d12478e5f4ba\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.433914 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-internal-tls-certs\") pod \"952e94ca-2ca0-4a73-af07-8a02572a080b\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.433971 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-config-data\") pod \"952e94ca-2ca0-4a73-af07-8a02572a080b\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434060 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-httpd-run\") pod \"952e94ca-2ca0-4a73-af07-8a02572a080b\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434121 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-config-data\") pod \"16ac2620-c820-4881-b959-d12478e5f4ba\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434139 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-combined-ca-bundle\") pod \"952e94ca-2ca0-4a73-af07-8a02572a080b\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434206 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-logs\") pod \"952e94ca-2ca0-4a73-af07-8a02572a080b\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434254 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-combined-ca-bundle\") pod \"16ac2620-c820-4881-b959-d12478e5f4ba\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434405 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"952e94ca-2ca0-4a73-af07-8a02572a080b\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434488 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrmm6\" (UniqueName: \"kubernetes.io/projected/16ac2620-c820-4881-b959-d12478e5f4ba-kube-api-access-wrmm6\") pod \"16ac2620-c820-4881-b959-d12478e5f4ba\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434544 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-public-tls-certs\") pod \"16ac2620-c820-4881-b959-d12478e5f4ba\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434569 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nhcj\" (UniqueName: \"kubernetes.io/projected/952e94ca-2ca0-4a73-af07-8a02572a080b-kube-api-access-2nhcj\") pod \"952e94ca-2ca0-4a73-af07-8a02572a080b\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434597 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-httpd-run\") pod \"16ac2620-c820-4881-b959-d12478e5f4ba\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434625 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-logs\") pod \"16ac2620-c820-4881-b959-d12478e5f4ba\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434658 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-scripts\") pod \"16ac2620-c820-4881-b959-d12478e5f4ba\" (UID: \"16ac2620-c820-4881-b959-d12478e5f4ba\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434685 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-scripts\") pod \"952e94ca-2ca0-4a73-af07-8a02572a080b\" (UID: \"952e94ca-2ca0-4a73-af07-8a02572a080b\") "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.434924 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "952e94ca-2ca0-4a73-af07-8a02572a080b" (UID: "952e94ca-2ca0-4a73-af07-8a02572a080b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.435251 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16ac2620-c820-4881-b959-d12478e5f4ba" (UID: "16ac2620-c820-4881-b959-d12478e5f4ba"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.435333 4748 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.435723 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-logs" (OuterVolumeSpecName: "logs") pod "952e94ca-2ca0-4a73-af07-8a02572a080b" (UID: "952e94ca-2ca0-4a73-af07-8a02572a080b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.438337 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-logs" (OuterVolumeSpecName: "logs") pod "16ac2620-c820-4881-b959-d12478e5f4ba" (UID: "16ac2620-c820-4881-b959-d12478e5f4ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.441095 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "16ac2620-c820-4881-b959-d12478e5f4ba" (UID: "16ac2620-c820-4881-b959-d12478e5f4ba"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.441497 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952e94ca-2ca0-4a73-af07-8a02572a080b-kube-api-access-2nhcj" (OuterVolumeSpecName: "kube-api-access-2nhcj") pod "952e94ca-2ca0-4a73-af07-8a02572a080b" (UID: "952e94ca-2ca0-4a73-af07-8a02572a080b"). InnerVolumeSpecName "kube-api-access-2nhcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.443087 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "952e94ca-2ca0-4a73-af07-8a02572a080b" (UID: "952e94ca-2ca0-4a73-af07-8a02572a080b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.444600 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-scripts" (OuterVolumeSpecName: "scripts") pod "16ac2620-c820-4881-b959-d12478e5f4ba" (UID: "16ac2620-c820-4881-b959-d12478e5f4ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.444746 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-scripts" (OuterVolumeSpecName: "scripts") pod "952e94ca-2ca0-4a73-af07-8a02572a080b" (UID: "952e94ca-2ca0-4a73-af07-8a02572a080b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.449992 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ac2620-c820-4881-b959-d12478e5f4ba-kube-api-access-wrmm6" (OuterVolumeSpecName: "kube-api-access-wrmm6") pod "16ac2620-c820-4881-b959-d12478e5f4ba" (UID: "16ac2620-c820-4881-b959-d12478e5f4ba"). InnerVolumeSpecName "kube-api-access-wrmm6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.492711 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16ac2620-c820-4881-b959-d12478e5f4ba" (UID: "16ac2620-c820-4881-b959-d12478e5f4ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.517072 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-config-data" (OuterVolumeSpecName: "config-data") pod "16ac2620-c820-4881-b959-d12478e5f4ba" (UID: "16ac2620-c820-4881-b959-d12478e5f4ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.519590 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-config-data" (OuterVolumeSpecName: "config-data") pod "952e94ca-2ca0-4a73-af07-8a02572a080b" (UID: "952e94ca-2ca0-4a73-af07-8a02572a080b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541266 4748 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541327 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541342 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541357 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952e94ca-2ca0-4a73-af07-8a02572a080b-logs\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541371 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541393 4748 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541403 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrmm6\" (UniqueName: \"kubernetes.io/projected/16ac2620-c820-4881-b959-d12478e5f4ba-kube-api-access-wrmm6\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541413 4748 reconciler_common.go:293] "Volume detached for
volume \"kube-api-access-2nhcj\" (UniqueName: \"kubernetes.io/projected/952e94ca-2ca0-4a73-af07-8a02572a080b-kube-api-access-2nhcj\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541422 4748 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541432 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16ac2620-c820-4881-b959-d12478e5f4ba-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541464 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.541517 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.545511 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"952e94ca-2ca0-4a73-af07-8a02572a080b","Type":"ContainerDied","Data":"03e83f620f71fe7d7a163a3a9827c784f99b1ddd38b713d0697b78ab72e15268"} Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.545583 4748 scope.go:117] "RemoveContainer" containerID="701bbd352e749b7cf6f1466360b7be0f1095943e20208c22277d4d7de3eaac02" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.545735 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.550682 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "952e94ca-2ca0-4a73-af07-8a02572a080b" (UID: "952e94ca-2ca0-4a73-af07-8a02572a080b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.552603 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16ac2620-c820-4881-b959-d12478e5f4ba","Type":"ContainerDied","Data":"22430d0042f0ed5cb007e28f6d3ccf1b7b2fb2775365375a469d0c0175dfe3fc"} Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.552711 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.561414 4748 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.570295 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "16ac2620-c820-4881-b959-d12478e5f4ba" (UID: "16ac2620-c820-4881-b959-d12478e5f4ba"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.571308 4748 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.599117 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "952e94ca-2ca0-4a73-af07-8a02572a080b" (UID: "952e94ca-2ca0-4a73-af07-8a02572a080b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.643633 4748 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.643679 4748 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16ac2620-c820-4881-b959-d12478e5f4ba-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.643698 4748 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.643714 4748 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.643726 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/952e94ca-2ca0-4a73-af07-8a02572a080b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.901602 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.923326 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.941754 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.978992 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.979399 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:56:42 crc kubenswrapper[4748]: E0320 10:56:42.980105 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952e94ca-2ca0-4a73-af07-8a02572a080b" containerName="glance-httpd" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.980200 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="952e94ca-2ca0-4a73-af07-8a02572a080b" containerName="glance-httpd" Mar 20 10:56:42 crc kubenswrapper[4748]: E0320 10:56:42.980270 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952e94ca-2ca0-4a73-af07-8a02572a080b" containerName="glance-log" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.980323 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="952e94ca-2ca0-4a73-af07-8a02572a080b" containerName="glance-log" Mar 20 10:56:42 crc kubenswrapper[4748]: E0320 10:56:42.980465 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ac2620-c820-4881-b959-d12478e5f4ba" containerName="glance-log" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.980529 4748 
state_mem.go:107] "Deleted CPUSet assignment" podUID="16ac2620-c820-4881-b959-d12478e5f4ba" containerName="glance-log" Mar 20 10:56:42 crc kubenswrapper[4748]: E0320 10:56:42.980588 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ac2620-c820-4881-b959-d12478e5f4ba" containerName="glance-httpd" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.980663 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ac2620-c820-4881-b959-d12478e5f4ba" containerName="glance-httpd" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.980960 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="952e94ca-2ca0-4a73-af07-8a02572a080b" containerName="glance-httpd" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.981070 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ac2620-c820-4881-b959-d12478e5f4ba" containerName="glance-httpd" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.981222 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ac2620-c820-4881-b959-d12478e5f4ba" containerName="glance-log" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.981304 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="952e94ca-2ca0-4a73-af07-8a02572a080b" containerName="glance-log" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.984784 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.988814 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.989118 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.989585 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.989725 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vdzj8" Mar 20 10:56:42 crc kubenswrapper[4748]: I0320 10:56:42.998378 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.004192 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.012754 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.013078 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.033518 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.046058 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.051651 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.051721 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.051769 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwfx9\" (UniqueName: \"kubernetes.io/projected/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-kube-api-access-pwfx9\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 
20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.051800 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.052069 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.052495 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-logs\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.052649 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.052921 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc 
kubenswrapper[4748]: E0320 10:56:43.141301 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 10:56:43 crc kubenswrapper[4748]: E0320 10:56:43.141858 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qhtff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartP
olicy:nil,} start failed in pod barbican-db-sync-j7jqs_openstack(a4a4f230-3fe6-44a4-a91b-5b0ea07ae755): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:56:43 crc kubenswrapper[4748]: E0320 10:56:43.143096 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-j7jqs" podUID="a4a4f230-3fe6-44a4-a91b-5b0ea07ae755" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.154876 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.154964 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwfx9\" (UniqueName: \"kubernetes.io/projected/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-kube-api-access-pwfx9\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.154995 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155025 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155056 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155077 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-scripts\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155114 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-logs\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155140 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155188 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155215 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-logs\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155239 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155296 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155745 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155803 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.155805 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.156022 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.156059 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.156061 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nncx5\" (UniqueName: \"kubernetes.io/projected/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-kube-api-access-nncx5\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.156198 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-logs\") pod \"glance-default-internal-api-0\" 
(UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.159008 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.159581 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.160026 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.161247 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.162848 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.168214 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.174274 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwfx9\" (UniqueName: \"kubernetes.io/projected/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-kube-api-access-pwfx9\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.182536 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.197381 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257323 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-config-data\") pod \"c2b0c5b1-956d-4aae-97b8-849e47e03677\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257403 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dddc05be-fe06-4c29-8102-d6118b6f36d5-horizon-secret-key\") pod \"dddc05be-fe06-4c29-8102-d6118b6f36d5\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257459 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m44cp\" (UniqueName: \"kubernetes.io/projected/c2b0c5b1-956d-4aae-97b8-849e47e03677-kube-api-access-m44cp\") pod 
\"c2b0c5b1-956d-4aae-97b8-849e47e03677\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257500 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-dns-svc\") pod \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257577 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qz5v\" (UniqueName: \"kubernetes.io/projected/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-kube-api-access-9qz5v\") pod \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257607 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxb4f\" (UniqueName: \"kubernetes.io/projected/dddc05be-fe06-4c29-8102-d6118b6f36d5-kube-api-access-jxb4f\") pod \"dddc05be-fe06-4c29-8102-d6118b6f36d5\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257662 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-scripts\") pod \"dddc05be-fe06-4c29-8102-d6118b6f36d5\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257709 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dddc05be-fe06-4c29-8102-d6118b6f36d5-logs\") pod \"dddc05be-fe06-4c29-8102-d6118b6f36d5\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257748 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-sb\") pod \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257779 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-config-data\") pod \"dddc05be-fe06-4c29-8102-d6118b6f36d5\" (UID: \"dddc05be-fe06-4c29-8102-d6118b6f36d5\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257813 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-nb\") pod \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257866 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-scripts\") pod \"c2b0c5b1-956d-4aae-97b8-849e47e03677\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257894 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-config\") pod \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\" (UID: \"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.257922 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2b0c5b1-956d-4aae-97b8-849e47e03677-horizon-secret-key\") pod \"c2b0c5b1-956d-4aae-97b8-849e47e03677\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 
10:56:43.257950 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2b0c5b1-956d-4aae-97b8-849e47e03677-logs\") pod \"c2b0c5b1-956d-4aae-97b8-849e47e03677\" (UID: \"c2b0c5b1-956d-4aae-97b8-849e47e03677\") " Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.258215 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-config-data\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.258270 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nncx5\" (UniqueName: \"kubernetes.io/projected/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-kube-api-access-nncx5\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.258307 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.258341 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.258379 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.258405 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-scripts\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.258439 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-logs\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.258496 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.259014 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.259707 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-scripts" 
(OuterVolumeSpecName: "scripts") pod "dddc05be-fe06-4c29-8102-d6118b6f36d5" (UID: "dddc05be-fe06-4c29-8102-d6118b6f36d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.261297 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dddc05be-fe06-4c29-8102-d6118b6f36d5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dddc05be-fe06-4c29-8102-d6118b6f36d5" (UID: "dddc05be-fe06-4c29-8102-d6118b6f36d5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.263020 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-config-data" (OuterVolumeSpecName: "config-data") pod "c2b0c5b1-956d-4aae-97b8-849e47e03677" (UID: "c2b0c5b1-956d-4aae-97b8-849e47e03677"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.263526 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.274472 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dddc05be-fe06-4c29-8102-d6118b6f36d5-logs" (OuterVolumeSpecName: "logs") pod "dddc05be-fe06-4c29-8102-d6118b6f36d5" (UID: "dddc05be-fe06-4c29-8102-d6118b6f36d5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.275961 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b0c5b1-956d-4aae-97b8-849e47e03677-kube-api-access-m44cp" (OuterVolumeSpecName: "kube-api-access-m44cp") pod "c2b0c5b1-956d-4aae-97b8-849e47e03677" (UID: "c2b0c5b1-956d-4aae-97b8-849e47e03677"). InnerVolumeSpecName "kube-api-access-m44cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.276511 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-scripts" (OuterVolumeSpecName: "scripts") pod "c2b0c5b1-956d-4aae-97b8-849e47e03677" (UID: "c2b0c5b1-956d-4aae-97b8-849e47e03677"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.277819 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b0c5b1-956d-4aae-97b8-849e47e03677-logs" (OuterVolumeSpecName: "logs") pod "c2b0c5b1-956d-4aae-97b8-849e47e03677" (UID: "c2b0c5b1-956d-4aae-97b8-849e47e03677"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.278293 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-logs\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.284322 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.297727 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-config-data" (OuterVolumeSpecName: "config-data") pod "dddc05be-fe06-4c29-8102-d6118b6f36d5" (UID: "dddc05be-fe06-4c29-8102-d6118b6f36d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.309986 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nncx5\" (UniqueName: \"kubernetes.io/projected/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-kube-api-access-nncx5\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.312887 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.319244 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dddc05be-fe06-4c29-8102-d6118b6f36d5-kube-api-access-jxb4f" (OuterVolumeSpecName: "kube-api-access-jxb4f") pod "dddc05be-fe06-4c29-8102-d6118b6f36d5" (UID: "dddc05be-fe06-4c29-8102-d6118b6f36d5"). InnerVolumeSpecName "kube-api-access-jxb4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.319643 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-scripts\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.320717 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-kube-api-access-9qz5v" (OuterVolumeSpecName: "kube-api-access-9qz5v") pod "fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" (UID: "fe43a4aa-ef4d-4509-8760-b98a4de5b2e5"). InnerVolumeSpecName "kube-api-access-9qz5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.322653 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b0c5b1-956d-4aae-97b8-849e47e03677-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c2b0c5b1-956d-4aae-97b8-849e47e03677" (UID: "c2b0c5b1-956d-4aae-97b8-849e47e03677"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.323564 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-config-data\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.332383 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.347113 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.359947 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.359986 4748 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dddc05be-fe06-4c29-8102-d6118b6f36d5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.360002 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m44cp\" (UniqueName: \"kubernetes.io/projected/c2b0c5b1-956d-4aae-97b8-849e47e03677-kube-api-access-m44cp\") on node \"crc\" DevicePath \"\"" Mar 20 
10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.360015 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qz5v\" (UniqueName: \"kubernetes.io/projected/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-kube-api-access-9qz5v\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.360027 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxb4f\" (UniqueName: \"kubernetes.io/projected/dddc05be-fe06-4c29-8102-d6118b6f36d5-kube-api-access-jxb4f\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.360038 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.360049 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dddc05be-fe06-4c29-8102-d6118b6f36d5-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.360060 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dddc05be-fe06-4c29-8102-d6118b6f36d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.360073 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2b0c5b1-956d-4aae-97b8-849e47e03677-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.360089 4748 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2b0c5b1-956d-4aae-97b8-849e47e03677-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.360099 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c2b0c5b1-956d-4aae-97b8-849e47e03677-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.365198 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-config" (OuterVolumeSpecName: "config") pod "fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" (UID: "fe43a4aa-ef4d-4509-8760-b98a4de5b2e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.367453 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" (UID: "fe43a4aa-ef4d-4509-8760-b98a4de5b2e5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.374248 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" (UID: "fe43a4aa-ef4d-4509-8760-b98a4de5b2e5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.401388 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" (UID: "fe43a4aa-ef4d-4509-8760-b98a4de5b2e5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.460932 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.461285 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.461356 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.461418 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.528395 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ac2620-c820-4881-b959-d12478e5f4ba" path="/var/lib/kubelet/pods/16ac2620-c820-4881-b959-d12478e5f4ba/volumes" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.529235 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="952e94ca-2ca0-4a73-af07-8a02572a080b" path="/var/lib/kubelet/pods/952e94ca-2ca0-4a73-af07-8a02572a080b/volumes" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.562810 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jb9kr" event={"ID":"fe43a4aa-ef4d-4509-8760-b98a4de5b2e5","Type":"ContainerDied","Data":"99179eae6e1188d0ddd1698060bd0537b36751de5986cf3d13bc2f2b07aba527"} Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.563101 4748 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jb9kr" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.563962 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d8fb6c75-qkwp6" event={"ID":"c2b0c5b1-956d-4aae-97b8-849e47e03677","Type":"ContainerDied","Data":"1ea4775183436c3f3ec39c31ec78b2d9a1c1d0a3f06d49e0063c67fc2c4bd9a7"} Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.564078 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d8fb6c75-qkwp6" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.566007 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8d4ccdcc-grvq4" event={"ID":"dddc05be-fe06-4c29-8102-d6118b6f36d5","Type":"ContainerDied","Data":"98518173afb007a8f4b226333d979551b3c2b7f486663828b15c2283d3e413e3"} Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.566108 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b8d4ccdcc-grvq4" Mar 20 10:56:43 crc kubenswrapper[4748]: E0320 10:56:43.567667 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-j7jqs" podUID="a4a4f230-3fe6-44a4-a91b-5b0ea07ae755" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.627907 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jb9kr"] Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.628375 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.639221 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jb9kr"] Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.667611 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69d8fb6c75-qkwp6"] Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.685324 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-69d8fb6c75-qkwp6"] Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.702087 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b8d4ccdcc-grvq4"] Mar 20 10:56:43 crc kubenswrapper[4748]: I0320 10:56:43.709609 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b8d4ccdcc-grvq4"] Mar 20 10:56:44 crc kubenswrapper[4748]: E0320 10:56:44.905022 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 10:56:44 crc kubenswrapper[4748]: E0320 10:56:44.906366 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ng9kt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rm5bp_openstack(af16052b-a5ab-4244-b007-69a32d050a35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:56:44 crc kubenswrapper[4748]: E0320 10:56:44.907626 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rm5bp" podUID="af16052b-a5ab-4244-b007-69a32d050a35" Mar 20 10:56:44 crc kubenswrapper[4748]: I0320 10:56:44.931458 4748 scope.go:117] "RemoveContainer" containerID="2f526096f41e83b934123a3a9b75ae5487b966cfc04857fb23ef0bf6f46d38ea" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.028726 4748 scope.go:117] "RemoveContainer" containerID="b179e670570fb05fdb090dd6f9b5d81690c8690e94d33b7fa7f91628a328b55c" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.177935 4748 scope.go:117] "RemoveContainer" containerID="348df6b5d3755f7725ea911eb2e14e2469d79eaae2e2a58deb61b534849f3b14" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.253929 4748 scope.go:117] "RemoveContainer" containerID="8790c054f45fcf832b100fb8aef752a10849709a0806741a7a4087d053414ea1" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.257436 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jb9kr" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.295585 4748 scope.go:117] "RemoveContainer" containerID="7c06a4bcfc3a018e2e063d74f0306892b82ab0ac2ca20afe5ec5de3fd15e55cf" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 
10:56:45.447628 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d79b6bb86-nhfts"] Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.471466 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b85b9d5c6-7bgxl"] Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.545501 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b0c5b1-956d-4aae-97b8-849e47e03677" path="/var/lib/kubelet/pods/c2b0c5b1-956d-4aae-97b8-849e47e03677/volumes" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.547399 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dddc05be-fe06-4c29-8102-d6118b6f36d5" path="/var/lib/kubelet/pods/dddc05be-fe06-4c29-8102-d6118b6f36d5/volumes" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.548698 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" path="/var/lib/kubelet/pods/fe43a4aa-ef4d-4509-8760-b98a4de5b2e5/volumes" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.613291 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4a84b3e-b01b-424e-969b-e2cbc625f3f3","Type":"ContainerStarted","Data":"d16c8ce847109b4cbb4c5c2f1fe4580f1a7331cc45263f62aa6dedd567c05801"} Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.620818 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-744pk"] Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.626656 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-g4qfl" event={"ID":"6340e84c-8ffb-40e5-a470-52a50bff86f1","Type":"ContainerStarted","Data":"082ff8fe131c6a070446d24be4bd88a999c7470b5b6ea25ae826803de9f4f0c7"} Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.632707 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d79b6bb86-nhfts" 
event={"ID":"f3de236a-e527-4582-8eb5-03ca8aa883e0","Type":"ContainerStarted","Data":"cc82fa1192b45cb51eb7f6afb9a3f150f0f8454aff84b3a8608d2321d4aadea4"} Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.633674 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b85b9d5c6-7bgxl" event={"ID":"18e8975c-a2d8-4319-b175-2a66ce3d97c9","Type":"ContainerStarted","Data":"69c482b45ad1acfdb209c46107ce31e29092379c1bd726b0b2a9235cbf1d7c97"} Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.637280 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cfd8f8d7c-h5mnf" event={"ID":"aa097fcb-4ad5-4fda-a410-a64ad13495d6","Type":"ContainerStarted","Data":"4e733315e69c8dbe83f67d93931f3e3f905d8ab4605e8d87bf5a3078d1e2d846"} Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.637356 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cfd8f8d7c-h5mnf" event={"ID":"aa097fcb-4ad5-4fda-a410-a64ad13495d6","Type":"ContainerStarted","Data":"68208b69a0c9e6eadf462e65d3e43cfd1e8f6f00bf91a2972db8339946ee5370"} Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.637494 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cfd8f8d7c-h5mnf" podUID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" containerName="horizon-log" containerID="cri-o://68208b69a0c9e6eadf462e65d3e43cfd1e8f6f00bf91a2972db8339946ee5370" gracePeriod=30 Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.637522 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cfd8f8d7c-h5mnf" podUID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" containerName="horizon" containerID="cri-o://4e733315e69c8dbe83f67d93931f3e3f905d8ab4605e8d87bf5a3078d1e2d846" gracePeriod=30 Mar 20 10:56:45 crc kubenswrapper[4748]: E0320 10:56:45.643733 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-rm5bp" podUID="af16052b-a5ab-4244-b007-69a32d050a35" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.664230 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-g4qfl" podStartSLOduration=2.356414128 podStartE2EDuration="33.66420899s" podCreationTimestamp="2026-03-20 10:56:12 +0000 UTC" firstStartedPulling="2026-03-20 10:56:13.95882187 +0000 UTC m=+1209.100367684" lastFinishedPulling="2026-03-20 10:56:45.266616732 +0000 UTC m=+1240.408162546" observedRunningTime="2026-03-20 10:56:45.65746531 +0000 UTC m=+1240.799011144" watchObservedRunningTime="2026-03-20 10:56:45.66420899 +0000 UTC m=+1240.805754804" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.735502 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cfd8f8d7c-h5mnf" podStartSLOduration=2.875001874 podStartE2EDuration="31.735472688s" podCreationTimestamp="2026-03-20 10:56:14 +0000 UTC" firstStartedPulling="2026-03-20 10:56:15.971151002 +0000 UTC m=+1211.112696816" lastFinishedPulling="2026-03-20 10:56:44.831621816 +0000 UTC m=+1239.973167630" observedRunningTime="2026-03-20 10:56:45.706724907 +0000 UTC m=+1240.848270721" watchObservedRunningTime="2026-03-20 10:56:45.735472688 +0000 UTC m=+1240.877018512" Mar 20 10:56:45 crc kubenswrapper[4748]: I0320 10:56:45.757197 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:56:45 crc kubenswrapper[4748]: W0320 10:56:45.758400 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f4ca5c7_6a70_485e_b39d_e786ad0004a5.slice/crio-f3bf85e569ec584f9e22583b8e0f41f19087cd0ff785ae06e648d27836965069 WatchSource:0}: Error finding container 
f3bf85e569ec584f9e22583b8e0f41f19087cd0ff785ae06e648d27836965069: Status 404 returned error can't find the container with id f3bf85e569ec584f9e22583b8e0f41f19087cd0ff785ae06e648d27836965069 Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.656905 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-744pk" event={"ID":"66aa7b6f-a021-4161-b3da-ddb593f2b169","Type":"ContainerStarted","Data":"c0e9914d7fc10214a4367f42562b7721df70ea6cc98d9004fbc125748550f8ec"} Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.657519 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-744pk" event={"ID":"66aa7b6f-a021-4161-b3da-ddb593f2b169","Type":"ContainerStarted","Data":"e1562dcd28dcda8c9eeebf4f7f811c5b656785c42201f373be92009963e6ccd5"} Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.663123 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d79b6bb86-nhfts" event={"ID":"f3de236a-e527-4582-8eb5-03ca8aa883e0","Type":"ContainerStarted","Data":"805b74110f3c95eb6e4b59fc46a92c9b13405aaf86ff14411174713a0996d547"} Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.663177 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d79b6bb86-nhfts" event={"ID":"f3de236a-e527-4582-8eb5-03ca8aa883e0","Type":"ContainerStarted","Data":"babcb72aadeed16bfca8b67f243d7aab758016e834627f7ee1bf82d96d9f1e1e"} Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.665492 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b85b9d5c6-7bgxl" event={"ID":"18e8975c-a2d8-4319-b175-2a66ce3d97c9","Type":"ContainerStarted","Data":"1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f"} Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.665529 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b85b9d5c6-7bgxl" 
event={"ID":"18e8975c-a2d8-4319-b175-2a66ce3d97c9","Type":"ContainerStarted","Data":"87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7"} Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.674619 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f4ca5c7-6a70-485e-b39d-e786ad0004a5","Type":"ContainerStarted","Data":"12a28e7df4ad27261901712b424f1439a97b18d7aecab5a7b19754232c5975d5"} Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.674677 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f4ca5c7-6a70-485e-b39d-e786ad0004a5","Type":"ContainerStarted","Data":"f3bf85e569ec584f9e22583b8e0f41f19087cd0ff785ae06e648d27836965069"} Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.696800 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-744pk" podStartSLOduration=17.696771422 podStartE2EDuration="17.696771422s" podCreationTimestamp="2026-03-20 10:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:46.681818307 +0000 UTC m=+1241.823364121" watchObservedRunningTime="2026-03-20 10:56:46.696771422 +0000 UTC m=+1241.838317236" Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.724122 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b85b9d5c6-7bgxl" podStartSLOduration=25.724091598 podStartE2EDuration="25.724091598s" podCreationTimestamp="2026-03-20 10:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:46.707400009 +0000 UTC m=+1241.848945823" watchObservedRunningTime="2026-03-20 10:56:46.724091598 +0000 UTC m=+1241.865637402" Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.752931 4748 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:56:46 crc kubenswrapper[4748]: I0320 10:56:46.758089 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d79b6bb86-nhfts" podStartSLOduration=25.75806362 podStartE2EDuration="25.75806362s" podCreationTimestamp="2026-03-20 10:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:46.743616228 +0000 UTC m=+1241.885162052" watchObservedRunningTime="2026-03-20 10:56:46.75806362 +0000 UTC m=+1241.899609434" Mar 20 10:56:47 crc kubenswrapper[4748]: W0320 10:56:47.385678 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3de89ba4_6b05_410a_a7a0_e6ec1e1ba095.slice/crio-04d6864f5cfac2f58a829489b9bdc1b6c4e7fec51052db50a77a31f8dcb39c8b WatchSource:0}: Error finding container 04d6864f5cfac2f58a829489b9bdc1b6c4e7fec51052db50a77a31f8dcb39c8b: Status 404 returned error can't find the container with id 04d6864f5cfac2f58a829489b9bdc1b6c4e7fec51052db50a77a31f8dcb39c8b Mar 20 10:56:47 crc kubenswrapper[4748]: I0320 10:56:47.741488 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095","Type":"ContainerStarted","Data":"04d6864f5cfac2f58a829489b9bdc1b6c4e7fec51052db50a77a31f8dcb39c8b"} Mar 20 10:56:48 crc kubenswrapper[4748]: I0320 10:56:48.767627 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f4ca5c7-6a70-485e-b39d-e786ad0004a5","Type":"ContainerStarted","Data":"df4ecb4636ea9e29b83162f356f15c0a5c472e8b049de36d07077b5bb611b6dc"} Mar 20 10:56:48 crc kubenswrapper[4748]: I0320 10:56:48.788356 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095","Type":"ContainerStarted","Data":"065f22a766d13285d2727ecf62800ca8102915dd8b53b27ce6bd138ca82c75d0"} Mar 20 10:56:48 crc kubenswrapper[4748]: I0320 10:56:48.800553 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.800530736 podStartE2EDuration="6.800530736s" podCreationTimestamp="2026-03-20 10:56:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:48.799395357 +0000 UTC m=+1243.940941171" watchObservedRunningTime="2026-03-20 10:56:48.800530736 +0000 UTC m=+1243.942076560" Mar 20 10:56:49 crc kubenswrapper[4748]: I0320 10:56:49.799275 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4a84b3e-b01b-424e-969b-e2cbc625f3f3","Type":"ContainerStarted","Data":"7693a571a1bb3556c78b7802243ef6a1734518cfa95c9943f8df86099554cc32"} Mar 20 10:56:49 crc kubenswrapper[4748]: I0320 10:56:49.804623 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095","Type":"ContainerStarted","Data":"ec96f6c09a0d2fe8560c444f5939eb9b0f5bb271f3d6e415b8cd9d420c3f3d97"} Mar 20 10:56:49 crc kubenswrapper[4748]: I0320 10:56:49.836986 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.836957325 podStartE2EDuration="7.836957325s" podCreationTimestamp="2026-03-20 10:56:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:49.826488813 +0000 UTC m=+1244.968034637" watchObservedRunningTime="2026-03-20 10:56:49.836957325 +0000 UTC m=+1244.978503129" Mar 20 10:56:51 crc kubenswrapper[4748]: I0320 10:56:51.478175 4748 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:51 crc kubenswrapper[4748]: I0320 10:56:51.478590 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:56:51 crc kubenswrapper[4748]: I0320 10:56:51.588659 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:51 crc kubenswrapper[4748]: I0320 10:56:51.588754 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:56:52 crc kubenswrapper[4748]: I0320 10:56:52.853480 4748 generic.go:334] "Generic (PLEG): container finished" podID="66aa7b6f-a021-4161-b3da-ddb593f2b169" containerID="c0e9914d7fc10214a4367f42562b7721df70ea6cc98d9004fbc125748550f8ec" exitCode=0 Mar 20 10:56:52 crc kubenswrapper[4748]: I0320 10:56:52.853553 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-744pk" event={"ID":"66aa7b6f-a021-4161-b3da-ddb593f2b169","Type":"ContainerDied","Data":"c0e9914d7fc10214a4367f42562b7721df70ea6cc98d9004fbc125748550f8ec"} Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.315295 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.315396 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.361162 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.369871 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.628916 4748 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.629312 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.673241 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.691026 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.867558 4748 generic.go:334] "Generic (PLEG): container finished" podID="6340e84c-8ffb-40e5-a470-52a50bff86f1" containerID="082ff8fe131c6a070446d24be4bd88a999c7470b5b6ea25ae826803de9f4f0c7" exitCode=0 Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.867702 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-g4qfl" event={"ID":"6340e84c-8ffb-40e5-a470-52a50bff86f1","Type":"ContainerDied","Data":"082ff8fe131c6a070446d24be4bd88a999c7470b5b6ea25ae826803de9f4f0c7"} Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.868283 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.868318 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.868334 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 10:56:53 crc kubenswrapper[4748]: I0320 10:56:53.868346 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 10:56:55 crc kubenswrapper[4748]: I0320 
10:56:55.283705 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:56:56 crc kubenswrapper[4748]: I0320 10:56:56.752155 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:56 crc kubenswrapper[4748]: I0320 10:56:56.752638 4748 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.150781 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.443786 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.592424 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-744pk" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.624514 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-g4qfl" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.666502 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k26wc\" (UniqueName: \"kubernetes.io/projected/66aa7b6f-a021-4161-b3da-ddb593f2b169-kube-api-access-k26wc\") pod \"66aa7b6f-a021-4161-b3da-ddb593f2b169\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.666643 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-combined-ca-bundle\") pod \"66aa7b6f-a021-4161-b3da-ddb593f2b169\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.666679 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-fernet-keys\") pod \"66aa7b6f-a021-4161-b3da-ddb593f2b169\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.666723 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-scripts\") pod \"66aa7b6f-a021-4161-b3da-ddb593f2b169\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.666743 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-config-data\") pod \"66aa7b6f-a021-4161-b3da-ddb593f2b169\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.666769 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-credential-keys\") pod \"66aa7b6f-a021-4161-b3da-ddb593f2b169\" (UID: \"66aa7b6f-a021-4161-b3da-ddb593f2b169\") " Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.674961 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-scripts" (OuterVolumeSpecName: "scripts") pod "66aa7b6f-a021-4161-b3da-ddb593f2b169" (UID: "66aa7b6f-a021-4161-b3da-ddb593f2b169"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.675059 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66aa7b6f-a021-4161-b3da-ddb593f2b169-kube-api-access-k26wc" (OuterVolumeSpecName: "kube-api-access-k26wc") pod "66aa7b6f-a021-4161-b3da-ddb593f2b169" (UID: "66aa7b6f-a021-4161-b3da-ddb593f2b169"). InnerVolumeSpecName "kube-api-access-k26wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.675082 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "66aa7b6f-a021-4161-b3da-ddb593f2b169" (UID: "66aa7b6f-a021-4161-b3da-ddb593f2b169"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.675085 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "66aa7b6f-a021-4161-b3da-ddb593f2b169" (UID: "66aa7b6f-a021-4161-b3da-ddb593f2b169"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.700056 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-config-data" (OuterVolumeSpecName: "config-data") pod "66aa7b6f-a021-4161-b3da-ddb593f2b169" (UID: "66aa7b6f-a021-4161-b3da-ddb593f2b169"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.733959 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66aa7b6f-a021-4161-b3da-ddb593f2b169" (UID: "66aa7b6f-a021-4161-b3da-ddb593f2b169"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.770554 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jbt7\" (UniqueName: \"kubernetes.io/projected/6340e84c-8ffb-40e5-a470-52a50bff86f1-kube-api-access-2jbt7\") pod \"6340e84c-8ffb-40e5-a470-52a50bff86f1\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.770729 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-scripts\") pod \"6340e84c-8ffb-40e5-a470-52a50bff86f1\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.770802 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-combined-ca-bundle\") pod \"6340e84c-8ffb-40e5-a470-52a50bff86f1\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " Mar 20 
10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.770949 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6340e84c-8ffb-40e5-a470-52a50bff86f1-logs\") pod \"6340e84c-8ffb-40e5-a470-52a50bff86f1\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.770983 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-config-data\") pod \"6340e84c-8ffb-40e5-a470-52a50bff86f1\" (UID: \"6340e84c-8ffb-40e5-a470-52a50bff86f1\") " Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.771486 4748 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.771498 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.771509 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.771519 4748 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.771530 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k26wc\" (UniqueName: \"kubernetes.io/projected/66aa7b6f-a021-4161-b3da-ddb593f2b169-kube-api-access-k26wc\") on node \"crc\" DevicePath \"\"" Mar 20 
10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.771540 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aa7b6f-a021-4161-b3da-ddb593f2b169-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.777995 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-scripts" (OuterVolumeSpecName: "scripts") pod "6340e84c-8ffb-40e5-a470-52a50bff86f1" (UID: "6340e84c-8ffb-40e5-a470-52a50bff86f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.778281 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6340e84c-8ffb-40e5-a470-52a50bff86f1-logs" (OuterVolumeSpecName: "logs") pod "6340e84c-8ffb-40e5-a470-52a50bff86f1" (UID: "6340e84c-8ffb-40e5-a470-52a50bff86f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.780150 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6340e84c-8ffb-40e5-a470-52a50bff86f1-kube-api-access-2jbt7" (OuterVolumeSpecName: "kube-api-access-2jbt7") pod "6340e84c-8ffb-40e5-a470-52a50bff86f1" (UID: "6340e84c-8ffb-40e5-a470-52a50bff86f1"). InnerVolumeSpecName "kube-api-access-2jbt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.801375 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6340e84c-8ffb-40e5-a470-52a50bff86f1" (UID: "6340e84c-8ffb-40e5-a470-52a50bff86f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.802009 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-config-data" (OuterVolumeSpecName: "config-data") pod "6340e84c-8ffb-40e5-a470-52a50bff86f1" (UID: "6340e84c-8ffb-40e5-a470-52a50bff86f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.874391 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6340e84c-8ffb-40e5-a470-52a50bff86f1-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.874444 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.874462 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jbt7\" (UniqueName: \"kubernetes.io/projected/6340e84c-8ffb-40e5-a470-52a50bff86f1-kube-api-access-2jbt7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.874473 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.874481 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6340e84c-8ffb-40e5-a470-52a50bff86f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.910429 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 10:56:57 crc 
kubenswrapper[4748]: I0320 10:56:57.919322 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-g4qfl" event={"ID":"6340e84c-8ffb-40e5-a470-52a50bff86f1","Type":"ContainerDied","Data":"1c01d9fe33d0b6c8604dbe356b18748c2c2ba07b2ee401d34a3e154f5a2026cb"} Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.919357 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-g4qfl" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.919378 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c01d9fe33d0b6c8604dbe356b18748c2c2ba07b2ee401d34a3e154f5a2026cb" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.935010 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-744pk" event={"ID":"66aa7b6f-a021-4161-b3da-ddb593f2b169","Type":"ContainerDied","Data":"e1562dcd28dcda8c9eeebf4f7f811c5b656785c42201f373be92009963e6ccd5"} Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.935070 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1562dcd28dcda8c9eeebf4f7f811c5b656785c42201f373be92009963e6ccd5" Mar 20 10:56:57 crc kubenswrapper[4748]: I0320 10:56:57.935113 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-744pk" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.733355 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-688f5b7cfd-ffmqn"] Mar 20 10:56:58 crc kubenswrapper[4748]: E0320 10:56:58.734508 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66aa7b6f-a021-4161-b3da-ddb593f2b169" containerName="keystone-bootstrap" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.734533 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="66aa7b6f-a021-4161-b3da-ddb593f2b169" containerName="keystone-bootstrap" Mar 20 10:56:58 crc kubenswrapper[4748]: E0320 10:56:58.734545 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerName="init" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.734554 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerName="init" Mar 20 10:56:58 crc kubenswrapper[4748]: E0320 10:56:58.734570 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerName="dnsmasq-dns" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.734579 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerName="dnsmasq-dns" Mar 20 10:56:58 crc kubenswrapper[4748]: E0320 10:56:58.734603 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6340e84c-8ffb-40e5-a470-52a50bff86f1" containerName="placement-db-sync" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.734616 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6340e84c-8ffb-40e5-a470-52a50bff86f1" containerName="placement-db-sync" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.734865 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe43a4aa-ef4d-4509-8760-b98a4de5b2e5" containerName="dnsmasq-dns" Mar 20 
10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.734943 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="66aa7b6f-a021-4161-b3da-ddb593f2b169" containerName="keystone-bootstrap" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.734971 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="6340e84c-8ffb-40e5-a470-52a50bff86f1" containerName="placement-db-sync" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.749868 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-688f5b7cfd-ffmqn"] Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.750015 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.754273 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.757267 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.757304 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.757719 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n27fv" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.757696 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.764080 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.892308 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6495464d6d-wmp49"] Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.894620 4748 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.896696 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-config-data\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.896936 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-credential-keys\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.897067 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-scripts\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.897218 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djtrv\" (UniqueName: \"kubernetes.io/projected/cc43e627-4d33-422e-bfc0-63cb746991ca-kube-api-access-djtrv\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.897346 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-internal-tls-certs\") pod \"keystone-688f5b7cfd-ffmqn\" 
(UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.897467 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-public-tls-certs\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.897575 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-fernet-keys\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.897683 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-combined-ca-bundle\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.899416 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.899643 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.899797 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2l5mf" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.899810 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 10:56:58 crc 
kubenswrapper[4748]: I0320 10:56:58.899889 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.903468 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6495464d6d-wmp49"] Mar 20 10:56:58 crc kubenswrapper[4748]: I0320 10:56:58.949923 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4a84b3e-b01b-424e-969b-e2cbc625f3f3","Type":"ContainerStarted","Data":"e17c8c2860c89b5f959f53f917aaa0d7342a3145b363b17aaba400791ed0c29f"} Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.001799 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djtrv\" (UniqueName: \"kubernetes.io/projected/cc43e627-4d33-422e-bfc0-63cb746991ca-kube-api-access-djtrv\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.001904 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-combined-ca-bundle\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.001932 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-public-tls-certs\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.001958 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-internal-tls-certs\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.001990 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-public-tls-certs\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.002036 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-fernet-keys\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.002052 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-internal-tls-certs\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.002074 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-combined-ca-bundle\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.002102 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq5cc\" (UniqueName: 
\"kubernetes.io/projected/a9f0a0d1-a120-45fd-95f6-6a5650096207-kube-api-access-cq5cc\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.002166 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f0a0d1-a120-45fd-95f6-6a5650096207-logs\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.002207 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-config-data\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.002253 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-scripts\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.002289 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-credential-keys\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.002308 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-config-data\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.002330 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-scripts\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.007945 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-public-tls-certs\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.009584 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-config-data\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.009967 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-scripts\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.010421 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-internal-tls-certs\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: 
\"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.010990 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-combined-ca-bundle\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.012299 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-credential-keys\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.013356 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc43e627-4d33-422e-bfc0-63cb746991ca-fernet-keys\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.031663 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djtrv\" (UniqueName: \"kubernetes.io/projected/cc43e627-4d33-422e-bfc0-63cb746991ca-kube-api-access-djtrv\") pod \"keystone-688f5b7cfd-ffmqn\" (UID: \"cc43e627-4d33-422e-bfc0-63cb746991ca\") " pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.080973 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.104686 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-scripts\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.104769 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-config-data\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.104875 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-combined-ca-bundle\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.104910 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-public-tls-certs\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.104973 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-internal-tls-certs\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc 
kubenswrapper[4748]: I0320 10:56:59.105016 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq5cc\" (UniqueName: \"kubernetes.io/projected/a9f0a0d1-a120-45fd-95f6-6a5650096207-kube-api-access-cq5cc\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.105047 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f0a0d1-a120-45fd-95f6-6a5650096207-logs\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.105501 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f0a0d1-a120-45fd-95f6-6a5650096207-logs\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.110685 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-config-data\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.111697 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-public-tls-certs\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.111701 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-combined-ca-bundle\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.112140 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-scripts\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.112592 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-internal-tls-certs\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.128399 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq5cc\" (UniqueName: \"kubernetes.io/projected/a9f0a0d1-a120-45fd-95f6-6a5650096207-kube-api-access-cq5cc\") pod \"placement-6495464d6d-wmp49\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.219468 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-56cc45587d-dchtq"] Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.220821 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.233777 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.248002 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56cc45587d-dchtq"] Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.312851 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-public-tls-certs\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.312925 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-internal-tls-certs\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.312974 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-config-data\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.313071 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-scripts\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.313112 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/9bd6666d-34bf-42aa-bac6-e119898e279d-logs\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.313173 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt7dj\" (UniqueName: \"kubernetes.io/projected/9bd6666d-34bf-42aa-bac6-e119898e279d-kube-api-access-mt7dj\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.313236 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-combined-ca-bundle\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.415496 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-combined-ca-bundle\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.417349 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-public-tls-certs\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.417408 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-internal-tls-certs\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.417449 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-config-data\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.417518 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-scripts\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.417540 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bd6666d-34bf-42aa-bac6-e119898e279d-logs\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.417574 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt7dj\" (UniqueName: \"kubernetes.io/projected/9bd6666d-34bf-42aa-bac6-e119898e279d-kube-api-access-mt7dj\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.421749 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bd6666d-34bf-42aa-bac6-e119898e279d-logs\") pod \"placement-56cc45587d-dchtq\" (UID: 
\"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.425608 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-config-data\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.426138 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-internal-tls-certs\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.427663 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-public-tls-certs\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.429365 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-combined-ca-bundle\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.433090 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd6666d-34bf-42aa-bac6-e119898e279d-scripts\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: 
I0320 10:56:59.437707 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt7dj\" (UniqueName: \"kubernetes.io/projected/9bd6666d-34bf-42aa-bac6-e119898e279d-kube-api-access-mt7dj\") pod \"placement-56cc45587d-dchtq\" (UID: \"9bd6666d-34bf-42aa-bac6-e119898e279d\") " pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.592496 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.765602 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-688f5b7cfd-ffmqn"] Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.930500 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6495464d6d-wmp49"] Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.995685 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-688f5b7cfd-ffmqn" event={"ID":"cc43e627-4d33-422e-bfc0-63cb746991ca","Type":"ContainerStarted","Data":"cc5331442fb71fe8354676ff35e8a46014ac4cfb28b1615f304d75a44e86bc3f"} Mar 20 10:56:59 crc kubenswrapper[4748]: I0320 10:56:59.998734 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6495464d6d-wmp49" event={"ID":"a9f0a0d1-a120-45fd-95f6-6a5650096207","Type":"ContainerStarted","Data":"a6e903d6deefbe2a66160c8dd4522cd6885148e114f0b955ac6911fe7b11bf32"} Mar 20 10:57:00 crc kubenswrapper[4748]: I0320 10:57:00.177817 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56cc45587d-dchtq"] Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.020417 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6495464d6d-wmp49" event={"ID":"a9f0a0d1-a120-45fd-95f6-6a5650096207","Type":"ContainerStarted","Data":"905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960"} Mar 20 10:57:01 crc kubenswrapper[4748]: 
I0320 10:57:01.020826 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6495464d6d-wmp49" event={"ID":"a9f0a0d1-a120-45fd-95f6-6a5650096207","Type":"ContainerStarted","Data":"aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a"} Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.022119 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.022145 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.025748 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-688f5b7cfd-ffmqn" event={"ID":"cc43e627-4d33-422e-bfc0-63cb746991ca","Type":"ContainerStarted","Data":"13c1261de0cf5996a82a38a34d2c641fbd2236eaf17b7feb67c882fa8016fd72"} Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.026420 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.042264 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-j7jqs" event={"ID":"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755","Type":"ContainerStarted","Data":"219f24c3854d372dd4332eb0a35515344f2b41d6e05185335df317097eb29204"} Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.058462 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6495464d6d-wmp49" podStartSLOduration=3.058437139 podStartE2EDuration="3.058437139s" podCreationTimestamp="2026-03-20 10:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:01.044336805 +0000 UTC m=+1256.185882629" watchObservedRunningTime="2026-03-20 10:57:01.058437139 +0000 UTC m=+1256.199982953" Mar 20 
10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.065825 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56cc45587d-dchtq" event={"ID":"9bd6666d-34bf-42aa-bac6-e119898e279d","Type":"ContainerStarted","Data":"8f6aa61f85503ea7efc32227d7281c6e982bc5449c5ebe93c2ebd30c47a98beb"} Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.065925 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56cc45587d-dchtq" event={"ID":"9bd6666d-34bf-42aa-bac6-e119898e279d","Type":"ContainerStarted","Data":"7001a6aa0121d229e15f0cceb156490d621ca6c78a52ca1b26a6b79cc4cd28d7"} Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.104167 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-j7jqs" podStartSLOduration=3.430799598 podStartE2EDuration="49.104137956s" podCreationTimestamp="2026-03-20 10:56:12 +0000 UTC" firstStartedPulling="2026-03-20 10:56:14.564396055 +0000 UTC m=+1209.705941869" lastFinishedPulling="2026-03-20 10:57:00.237734413 +0000 UTC m=+1255.379280227" observedRunningTime="2026-03-20 10:57:01.096790052 +0000 UTC m=+1256.238335866" watchObservedRunningTime="2026-03-20 10:57:01.104137956 +0000 UTC m=+1256.245683780" Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.106402 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-688f5b7cfd-ffmqn" podStartSLOduration=3.106393523 podStartE2EDuration="3.106393523s" podCreationTimestamp="2026-03-20 10:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:01.073176799 +0000 UTC m=+1256.214722753" watchObservedRunningTime="2026-03-20 10:57:01.106393523 +0000 UTC m=+1256.247939337" Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.479814 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b85b9d5c6-7bgxl" 
podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 20 10:57:01 crc kubenswrapper[4748]: I0320 10:57:01.593120 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7d79b6bb86-nhfts" podUID="f3de236a-e527-4582-8eb5-03ca8aa883e0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 10:57:02 crc kubenswrapper[4748]: I0320 10:57:02.081069 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56cc45587d-dchtq" event={"ID":"9bd6666d-34bf-42aa-bac6-e119898e279d","Type":"ContainerStarted","Data":"f3f343f55a72e017c2ae95254212fd8f871cdf00ebb0a24a008ab3a85c5da660"} Mar 20 10:57:02 crc kubenswrapper[4748]: I0320 10:57:02.082882 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:57:02 crc kubenswrapper[4748]: I0320 10:57:02.082915 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:57:02 crc kubenswrapper[4748]: I0320 10:57:02.130448 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-56cc45587d-dchtq" podStartSLOduration=3.13041045 podStartE2EDuration="3.13041045s" podCreationTimestamp="2026-03-20 10:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:02.117163577 +0000 UTC m=+1257.258709391" watchObservedRunningTime="2026-03-20 10:57:02.13041045 +0000 UTC m=+1257.271956264" Mar 20 10:57:04 crc kubenswrapper[4748]: I0320 10:57:04.110423 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-rm5bp" event={"ID":"af16052b-a5ab-4244-b007-69a32d050a35","Type":"ContainerStarted","Data":"b38554dd1be02a45128fe9b6df10200ec1e855d0b4273f34b69a7887f02090c1"} Mar 20 10:57:04 crc kubenswrapper[4748]: I0320 10:57:04.147308 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rm5bp" podStartSLOduration=4.359599739 podStartE2EDuration="52.147279074s" podCreationTimestamp="2026-03-20 10:56:12 +0000 UTC" firstStartedPulling="2026-03-20 10:56:14.642522938 +0000 UTC m=+1209.784068752" lastFinishedPulling="2026-03-20 10:57:02.430202273 +0000 UTC m=+1257.571748087" observedRunningTime="2026-03-20 10:57:04.130552244 +0000 UTC m=+1259.272098068" watchObservedRunningTime="2026-03-20 10:57:04.147279074 +0000 UTC m=+1259.288824898" Mar 20 10:57:10 crc kubenswrapper[4748]: I0320 10:57:10.183459 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4a84b3e-b01b-424e-969b-e2cbc625f3f3","Type":"ContainerStarted","Data":"3ade9b52d1222e59d2a2a50e0f2ffc390b4931ec098ad50961c13fd3ee0ad8a5"} Mar 20 10:57:10 crc kubenswrapper[4748]: I0320 10:57:10.184078 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 10:57:10 crc kubenswrapper[4748]: I0320 10:57:10.183894 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="proxy-httpd" containerID="cri-o://3ade9b52d1222e59d2a2a50e0f2ffc390b4931ec098ad50961c13fd3ee0ad8a5" gracePeriod=30 Mar 20 10:57:10 crc kubenswrapper[4748]: I0320 10:57:10.183907 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="sg-core" containerID="cri-o://e17c8c2860c89b5f959f53f917aaa0d7342a3145b363b17aaba400791ed0c29f" gracePeriod=30 Mar 20 10:57:10 crc kubenswrapper[4748]: I0320 
10:57:10.183918 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="ceilometer-notification-agent" containerID="cri-o://7693a571a1bb3556c78b7802243ef6a1734518cfa95c9943f8df86099554cc32" gracePeriod=30 Mar 20 10:57:10 crc kubenswrapper[4748]: I0320 10:57:10.183606 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="ceilometer-central-agent" containerID="cri-o://d16c8ce847109b4cbb4c5c2f1fe4580f1a7331cc45263f62aa6dedd567c05801" gracePeriod=30 Mar 20 10:57:10 crc kubenswrapper[4748]: I0320 10:57:10.216705 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.959306916 podStartE2EDuration="58.216680836s" podCreationTimestamp="2026-03-20 10:56:12 +0000 UTC" firstStartedPulling="2026-03-20 10:56:14.567107943 +0000 UTC m=+1209.708653757" lastFinishedPulling="2026-03-20 10:57:09.824481853 +0000 UTC m=+1264.966027677" observedRunningTime="2026-03-20 10:57:10.212346637 +0000 UTC m=+1265.353892471" watchObservedRunningTime="2026-03-20 10:57:10.216680836 +0000 UTC m=+1265.358226650" Mar 20 10:57:11 crc kubenswrapper[4748]: I0320 10:57:11.197885 4748 generic.go:334] "Generic (PLEG): container finished" podID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerID="3ade9b52d1222e59d2a2a50e0f2ffc390b4931ec098ad50961c13fd3ee0ad8a5" exitCode=0 Mar 20 10:57:11 crc kubenswrapper[4748]: I0320 10:57:11.198532 4748 generic.go:334] "Generic (PLEG): container finished" podID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerID="e17c8c2860c89b5f959f53f917aaa0d7342a3145b363b17aaba400791ed0c29f" exitCode=2 Mar 20 10:57:11 crc kubenswrapper[4748]: I0320 10:57:11.198553 4748 generic.go:334] "Generic (PLEG): container finished" podID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" 
containerID="d16c8ce847109b4cbb4c5c2f1fe4580f1a7331cc45263f62aa6dedd567c05801" exitCode=0 Mar 20 10:57:11 crc kubenswrapper[4748]: I0320 10:57:11.197956 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4a84b3e-b01b-424e-969b-e2cbc625f3f3","Type":"ContainerDied","Data":"3ade9b52d1222e59d2a2a50e0f2ffc390b4931ec098ad50961c13fd3ee0ad8a5"} Mar 20 10:57:11 crc kubenswrapper[4748]: I0320 10:57:11.198600 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4a84b3e-b01b-424e-969b-e2cbc625f3f3","Type":"ContainerDied","Data":"e17c8c2860c89b5f959f53f917aaa0d7342a3145b363b17aaba400791ed0c29f"} Mar 20 10:57:11 crc kubenswrapper[4748]: I0320 10:57:11.198613 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4a84b3e-b01b-424e-969b-e2cbc625f3f3","Type":"ContainerDied","Data":"d16c8ce847109b4cbb4c5c2f1fe4580f1a7331cc45263f62aa6dedd567c05801"} Mar 20 10:57:11 crc kubenswrapper[4748]: I0320 10:57:11.588986 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7d79b6bb86-nhfts" podUID="f3de236a-e527-4582-8eb5-03ca8aa883e0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 10:57:12 crc kubenswrapper[4748]: I0320 10:57:12.211452 4748 generic.go:334] "Generic (PLEG): container finished" podID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerID="7693a571a1bb3556c78b7802243ef6a1734518cfa95c9943f8df86099554cc32" exitCode=0 Mar 20 10:57:12 crc kubenswrapper[4748]: I0320 10:57:12.211517 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4a84b3e-b01b-424e-969b-e2cbc625f3f3","Type":"ContainerDied","Data":"7693a571a1bb3556c78b7802243ef6a1734518cfa95c9943f8df86099554cc32"} Mar 20 10:57:12 crc kubenswrapper[4748]: I0320 10:57:12.946246 4748 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.138302 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzsrp\" (UniqueName: \"kubernetes.io/projected/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-kube-api-access-xzsrp\") pod \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.138421 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-sg-core-conf-yaml\") pod \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.138445 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-scripts\") pod \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.138469 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-combined-ca-bundle\") pod \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.138492 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-config-data\") pod \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.138568 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-log-httpd\") pod \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.138694 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-run-httpd\") pod \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\" (UID: \"a4a84b3e-b01b-424e-969b-e2cbc625f3f3\") " Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.139246 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a4a84b3e-b01b-424e-969b-e2cbc625f3f3" (UID: "a4a84b3e-b01b-424e-969b-e2cbc625f3f3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.139294 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a4a84b3e-b01b-424e-969b-e2cbc625f3f3" (UID: "a4a84b3e-b01b-424e-969b-e2cbc625f3f3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.164080 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-kube-api-access-xzsrp" (OuterVolumeSpecName: "kube-api-access-xzsrp") pod "a4a84b3e-b01b-424e-969b-e2cbc625f3f3" (UID: "a4a84b3e-b01b-424e-969b-e2cbc625f3f3"). InnerVolumeSpecName "kube-api-access-xzsrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.164204 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-scripts" (OuterVolumeSpecName: "scripts") pod "a4a84b3e-b01b-424e-969b-e2cbc625f3f3" (UID: "a4a84b3e-b01b-424e-969b-e2cbc625f3f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.191177 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a4a84b3e-b01b-424e-969b-e2cbc625f3f3" (UID: "a4a84b3e-b01b-424e-969b-e2cbc625f3f3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.225395 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4a84b3e-b01b-424e-969b-e2cbc625f3f3","Type":"ContainerDied","Data":"20a9ce4214b791e72549d5217073d33447106b29028f5240e3d8862aeb6db1ee"} Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.225499 4748 scope.go:117] "RemoveContainer" containerID="3ade9b52d1222e59d2a2a50e0f2ffc390b4931ec098ad50961c13fd3ee0ad8a5" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.225411 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.227035 4748 generic.go:334] "Generic (PLEG): container finished" podID="a4a4f230-3fe6-44a4-a91b-5b0ea07ae755" containerID="219f24c3854d372dd4332eb0a35515344f2b41d6e05185335df317097eb29204" exitCode=0 Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.227088 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-j7jqs" event={"ID":"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755","Type":"ContainerDied","Data":"219f24c3854d372dd4332eb0a35515344f2b41d6e05185335df317097eb29204"} Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.244023 4748 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.244068 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzsrp\" (UniqueName: \"kubernetes.io/projected/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-kube-api-access-xzsrp\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.244083 4748 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.244091 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.244100 4748 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.248369 
4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-config-data" (OuterVolumeSpecName: "config-data") pod "a4a84b3e-b01b-424e-969b-e2cbc625f3f3" (UID: "a4a84b3e-b01b-424e-969b-e2cbc625f3f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.252605 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a84b3e-b01b-424e-969b-e2cbc625f3f3" (UID: "a4a84b3e-b01b-424e-969b-e2cbc625f3f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.345407 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.345447 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a84b3e-b01b-424e-969b-e2cbc625f3f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.355617 4748 scope.go:117] "RemoveContainer" containerID="e17c8c2860c89b5f959f53f917aaa0d7342a3145b363b17aaba400791ed0c29f" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.378149 4748 scope.go:117] "RemoveContainer" containerID="7693a571a1bb3556c78b7802243ef6a1734518cfa95c9943f8df86099554cc32" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.412572 4748 scope.go:117] "RemoveContainer" containerID="d16c8ce847109b4cbb4c5c2f1fe4580f1a7331cc45263f62aa6dedd567c05801" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.571971 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.589710 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.602405 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:13 crc kubenswrapper[4748]: E0320 10:57:13.602821 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="ceilometer-central-agent" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.602861 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="ceilometer-central-agent" Mar 20 10:57:13 crc kubenswrapper[4748]: E0320 10:57:13.602875 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="ceilometer-notification-agent" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.602885 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="ceilometer-notification-agent" Mar 20 10:57:13 crc kubenswrapper[4748]: E0320 10:57:13.602906 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="proxy-httpd" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.602916 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="proxy-httpd" Mar 20 10:57:13 crc kubenswrapper[4748]: E0320 10:57:13.602935 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="sg-core" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.602943 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="sg-core" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.603155 
4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="ceilometer-central-agent" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.603173 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="sg-core" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.603196 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="ceilometer-notification-agent" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.603216 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" containerName="proxy-httpd" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.610451 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.615004 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.615250 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.647174 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.648450 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-config-data\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.648518 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.648548 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-scripts\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.648582 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-log-httpd\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.648630 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.648827 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-run-httpd\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.648965 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkv7p\" (UniqueName: \"kubernetes.io/projected/bc703d08-b2b6-4531-8da1-050c7fa93964-kube-api-access-lkv7p\") pod \"ceilometer-0\" (UID: 
\"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.697392 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.750331 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.750395 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-scripts\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.750447 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-log-httpd\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.750535 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.750593 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-run-httpd\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc 
kubenswrapper[4748]: I0320 10:57:13.750661 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkv7p\" (UniqueName: \"kubernetes.io/projected/bc703d08-b2b6-4531-8da1-050c7fa93964-kube-api-access-lkv7p\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.750772 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-config-data\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.752676 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-run-httpd\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.753011 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-log-httpd\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.757750 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.768312 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.768789 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-config-data\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.771252 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-scripts\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.773755 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:13 crc kubenswrapper[4748]: E0320 10:57:13.774456 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-lkv7p], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="bc703d08-b2b6-4531-8da1-050c7fa93964" Mar 20 10:57:13 crc kubenswrapper[4748]: I0320 10:57:13.791904 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkv7p\" (UniqueName: \"kubernetes.io/projected/bc703d08-b2b6-4531-8da1-050c7fa93964-kube-api-access-lkv7p\") pod \"ceilometer-0\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " pod="openstack/ceilometer-0" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.237669 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.251854 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.361660 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-scripts\") pod \"bc703d08-b2b6-4531-8da1-050c7fa93964\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.361720 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkv7p\" (UniqueName: \"kubernetes.io/projected/bc703d08-b2b6-4531-8da1-050c7fa93964-kube-api-access-lkv7p\") pod \"bc703d08-b2b6-4531-8da1-050c7fa93964\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.361766 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-config-data\") pod \"bc703d08-b2b6-4531-8da1-050c7fa93964\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.361851 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-combined-ca-bundle\") pod \"bc703d08-b2b6-4531-8da1-050c7fa93964\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.361871 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-run-httpd\") pod \"bc703d08-b2b6-4531-8da1-050c7fa93964\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.361909 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-sg-core-conf-yaml\") pod \"bc703d08-b2b6-4531-8da1-050c7fa93964\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.361938 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-log-httpd\") pod \"bc703d08-b2b6-4531-8da1-050c7fa93964\" (UID: \"bc703d08-b2b6-4531-8da1-050c7fa93964\") " Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.362755 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bc703d08-b2b6-4531-8da1-050c7fa93964" (UID: "bc703d08-b2b6-4531-8da1-050c7fa93964"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.362955 4748 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.363221 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bc703d08-b2b6-4531-8da1-050c7fa93964" (UID: "bc703d08-b2b6-4531-8da1-050c7fa93964"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.368191 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc703d08-b2b6-4531-8da1-050c7fa93964" (UID: "bc703d08-b2b6-4531-8da1-050c7fa93964"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.368236 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bc703d08-b2b6-4531-8da1-050c7fa93964" (UID: "bc703d08-b2b6-4531-8da1-050c7fa93964"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.377052 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-scripts" (OuterVolumeSpecName: "scripts") pod "bc703d08-b2b6-4531-8da1-050c7fa93964" (UID: "bc703d08-b2b6-4531-8da1-050c7fa93964"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.377137 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc703d08-b2b6-4531-8da1-050c7fa93964-kube-api-access-lkv7p" (OuterVolumeSpecName: "kube-api-access-lkv7p") pod "bc703d08-b2b6-4531-8da1-050c7fa93964" (UID: "bc703d08-b2b6-4531-8da1-050c7fa93964"). InnerVolumeSpecName "kube-api-access-lkv7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.377157 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-config-data" (OuterVolumeSpecName: "config-data") pod "bc703d08-b2b6-4531-8da1-050c7fa93964" (UID: "bc703d08-b2b6-4531-8da1-050c7fa93964"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.466132 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.466191 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkv7p\" (UniqueName: \"kubernetes.io/projected/bc703d08-b2b6-4531-8da1-050c7fa93964-kube-api-access-lkv7p\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.466207 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.466218 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.466231 4748 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc703d08-b2b6-4531-8da1-050c7fa93964-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.466242 4748 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc703d08-b2b6-4531-8da1-050c7fa93964-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.562472 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-j7jqs" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.581579 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhtff\" (UniqueName: \"kubernetes.io/projected/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-kube-api-access-qhtff\") pod \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.581767 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-combined-ca-bundle\") pod \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.582059 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-db-sync-config-data\") pod \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\" (UID: \"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755\") " Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.611026 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a4a4f230-3fe6-44a4-a91b-5b0ea07ae755" (UID: "a4a4f230-3fe6-44a4-a91b-5b0ea07ae755"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.615102 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-kube-api-access-qhtff" (OuterVolumeSpecName: "kube-api-access-qhtff") pod "a4a4f230-3fe6-44a4-a91b-5b0ea07ae755" (UID: "a4a4f230-3fe6-44a4-a91b-5b0ea07ae755"). 
InnerVolumeSpecName "kube-api-access-qhtff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.633989 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a4f230-3fe6-44a4-a91b-5b0ea07ae755" (UID: "a4a4f230-3fe6-44a4-a91b-5b0ea07ae755"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.686001 4748 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.686362 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhtff\" (UniqueName: \"kubernetes.io/projected/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-kube-api-access-qhtff\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:14 crc kubenswrapper[4748]: I0320 10:57:14.686375 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.249564 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.249589 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-j7jqs" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.249552 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-j7jqs" event={"ID":"a4a4f230-3fe6-44a4-a91b-5b0ea07ae755","Type":"ContainerDied","Data":"c14c86cf03f9fe0ca3c0e2c665e5093475365ee25da6bfb5d95ce3271c77eacb"} Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.249647 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c14c86cf03f9fe0ca3c0e2c665e5093475365ee25da6bfb5d95ce3271c77eacb" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.328373 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.336381 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.376896 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:15 crc kubenswrapper[4748]: E0320 10:57:15.377709 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a4f230-3fe6-44a4-a91b-5b0ea07ae755" containerName="barbican-db-sync" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.377739 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a4f230-3fe6-44a4-a91b-5b0ea07ae755" containerName="barbican-db-sync" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.378139 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a4f230-3fe6-44a4-a91b-5b0ea07ae755" containerName="barbican-db-sync" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.380297 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.385221 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.385700 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.388318 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.500629 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-config-data\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.501013 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.501221 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-log-httpd\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.501280 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " 
pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.501495 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m7rx\" (UniqueName: \"kubernetes.io/projected/f9ddc460-94b3-44d2-9eed-2b4e344f8232-kube-api-access-8m7rx\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.501647 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-scripts\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.501763 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-run-httpd\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.526189 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a84b3e-b01b-424e-969b-e2cbc625f3f3" path="/var/lib/kubelet/pods/a4a84b3e-b01b-424e-969b-e2cbc625f3f3/volumes" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.527196 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc703d08-b2b6-4531-8da1-050c7fa93964" path="/var/lib/kubelet/pods/bc703d08-b2b6-4531-8da1-050c7fa93964/volumes" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.603011 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-log-httpd\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 
10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.603078 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.603149 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m7rx\" (UniqueName: \"kubernetes.io/projected/f9ddc460-94b3-44d2-9eed-2b4e344f8232-kube-api-access-8m7rx\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.603192 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-scripts\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.603236 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-run-httpd\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.603278 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-config-data\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.603309 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.603680 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-log-httpd\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.604375 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-run-httpd\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.611036 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-config-data\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.635396 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-scripts\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.635759 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.638955 4748 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.678583 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m7rx\" (UniqueName: \"kubernetes.io/projected/f9ddc460-94b3-44d2-9eed-2b4e344f8232-kube-api-access-8m7rx\") pod \"ceilometer-0\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.680692 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-556bf684bc-f9q9w"] Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.691744 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.697668 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.698005 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hfmzk" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.698194 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.705416 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c942b79-bc14-4a48-8fbd-32667bc1afc6-config-data-custom\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.705489 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c942b79-bc14-4a48-8fbd-32667bc1afc6-logs\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.705527 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6s2p\" (UniqueName: \"kubernetes.io/projected/6c942b79-bc14-4a48-8fbd-32667bc1afc6-kube-api-access-r6s2p\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.705619 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c942b79-bc14-4a48-8fbd-32667bc1afc6-combined-ca-bundle\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.705680 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c942b79-bc14-4a48-8fbd-32667bc1afc6-config-data\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.733486 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk"] Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.738581 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.745061 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.750943 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.762796 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-556bf684bc-f9q9w"] Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.811720 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c942b79-bc14-4a48-8fbd-32667bc1afc6-combined-ca-bundle\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.811882 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c942b79-bc14-4a48-8fbd-32667bc1afc6-config-data\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.811947 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c942b79-bc14-4a48-8fbd-32667bc1afc6-config-data-custom\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.811946 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk"] Mar 20 10:57:15 crc 
kubenswrapper[4748]: I0320 10:57:15.812005 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c942b79-bc14-4a48-8fbd-32667bc1afc6-logs\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.812063 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6s2p\" (UniqueName: \"kubernetes.io/projected/6c942b79-bc14-4a48-8fbd-32667bc1afc6-kube-api-access-r6s2p\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.816667 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c942b79-bc14-4a48-8fbd-32667bc1afc6-logs\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.826438 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c942b79-bc14-4a48-8fbd-32667bc1afc6-config-data\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.830690 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c942b79-bc14-4a48-8fbd-32667bc1afc6-combined-ca-bundle\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.838646 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c942b79-bc14-4a48-8fbd-32667bc1afc6-config-data-custom\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.863567 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6s2p\" (UniqueName: \"kubernetes.io/projected/6c942b79-bc14-4a48-8fbd-32667bc1afc6-kube-api-access-r6s2p\") pod \"barbican-worker-556bf684bc-f9q9w\" (UID: \"6c942b79-bc14-4a48-8fbd-32667bc1afc6\") " pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.910047 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-nt6hc"] Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.911886 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.923635 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8trcj\" (UniqueName: \"kubernetes.io/projected/ed96228e-6626-468c-bf60-a1073dfc123e-kube-api-access-8trcj\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.923709 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96228e-6626-468c-bf60-a1073dfc123e-combined-ca-bundle\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.923808 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed96228e-6626-468c-bf60-a1073dfc123e-logs\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.923846 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed96228e-6626-468c-bf60-a1073dfc123e-config-data\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.923866 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/ed96228e-6626-468c-bf60-a1073dfc123e-config-data-custom\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.926559 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-nt6hc"] Mar 20 10:57:15 crc kubenswrapper[4748]: I0320 10:57:15.958113 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-556bf684bc-f9q9w" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.055167 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.066498 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.066781 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-config\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.070406 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc 
kubenswrapper[4748]: I0320 10:57:16.071271 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8trcj\" (UniqueName: \"kubernetes.io/projected/ed96228e-6626-468c-bf60-a1073dfc123e-kube-api-access-8trcj\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.071340 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.076229 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96228e-6626-468c-bf60-a1073dfc123e-combined-ca-bundle\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.076748 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.078371 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed96228e-6626-468c-bf60-a1073dfc123e-logs\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " 
pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.078875 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed96228e-6626-468c-bf60-a1073dfc123e-config-data\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.079062 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed96228e-6626-468c-bf60-a1073dfc123e-config-data-custom\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.080601 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bzgg\" (UniqueName: \"kubernetes.io/projected/cd0027c7-fda5-4ece-a96d-a45762ad902a-kube-api-access-5bzgg\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.084785 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed96228e-6626-468c-bf60-a1073dfc123e-logs\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.133003 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96228e-6626-468c-bf60-a1073dfc123e-combined-ca-bundle\") pod 
\"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.133750 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d4c76548d-zpdfr"] Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.136242 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.143219 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.145786 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed96228e-6626-468c-bf60-a1073dfc123e-config-data\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.152589 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed96228e-6626-468c-bf60-a1073dfc123e-config-data-custom\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.182202 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8trcj\" (UniqueName: \"kubernetes.io/projected/ed96228e-6626-468c-bf60-a1073dfc123e-kube-api-access-8trcj\") pod \"barbican-keystone-listener-5b69f7d5cb-p5jsk\" (UID: \"ed96228e-6626-468c-bf60-a1073dfc123e\") " pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.187570 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bzgg\" (UniqueName: \"kubernetes.io/projected/cd0027c7-fda5-4ece-a96d-a45762ad902a-kube-api-access-5bzgg\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.187689 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-logs\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.187790 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.187922 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-config\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.188044 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.188111 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.188199 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data-custom\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.188269 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npgxc\" (UniqueName: \"kubernetes.io/projected/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-kube-api-access-npgxc\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.188355 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.188428 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.188542 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-combined-ca-bundle\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.189973 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.190621 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-config\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.192072 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.192683 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.193321 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.193639 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d4c76548d-zpdfr"] Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.212824 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bzgg\" (UniqueName: \"kubernetes.io/projected/cd0027c7-fda5-4ece-a96d-a45762ad902a-kube-api-access-5bzgg\") pod \"dnsmasq-dns-586bdc5f9-nt6hc\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.290823 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.290921 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-combined-ca-bundle\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.290968 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-logs\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.291107 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data-custom\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.291127 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npgxc\" (UniqueName: \"kubernetes.io/projected/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-kube-api-access-npgxc\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.292489 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-logs\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.301386 4748 generic.go:334] "Generic (PLEG): container finished" podID="e7f8f96f-de61-435d-a542-0b10d8860ccd" containerID="3bc0db86ee4b53156414e6bc24baf2c78fbeca995b3c9e9e4d945b908cfcb834" exitCode=0 Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.301484 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-r6t2v" event={"ID":"e7f8f96f-de61-435d-a542-0b10d8860ccd","Type":"ContainerDied","Data":"3bc0db86ee4b53156414e6bc24baf2c78fbeca995b3c9e9e4d945b908cfcb834"} Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.302814 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " 
pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.305449 4748 generic.go:334] "Generic (PLEG): container finished" podID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" containerID="4e733315e69c8dbe83f67d93931f3e3f905d8ab4605e8d87bf5a3078d1e2d846" exitCode=137 Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.305471 4748 generic.go:334] "Generic (PLEG): container finished" podID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" containerID="68208b69a0c9e6eadf462e65d3e43cfd1e8f6f00bf91a2972db8339946ee5370" exitCode=137 Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.305488 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cfd8f8d7c-h5mnf" event={"ID":"aa097fcb-4ad5-4fda-a410-a64ad13495d6","Type":"ContainerDied","Data":"4e733315e69c8dbe83f67d93931f3e3f905d8ab4605e8d87bf5a3078d1e2d846"} Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.305508 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cfd8f8d7c-h5mnf" event={"ID":"aa097fcb-4ad5-4fda-a410-a64ad13495d6","Type":"ContainerDied","Data":"68208b69a0c9e6eadf462e65d3e43cfd1e8f6f00bf91a2972db8339946ee5370"} Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.316670 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npgxc\" (UniqueName: \"kubernetes.io/projected/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-kube-api-access-npgxc\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.317640 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data-custom\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: 
I0320 10:57:16.328724 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.336902 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-combined-ca-bundle\") pod \"barbican-api-d4c76548d-zpdfr\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.445381 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.512075 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.540969 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.946533 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-556bf684bc-f9q9w"] Mar 20 10:57:16 crc kubenswrapper[4748]: I0320 10:57:16.956891 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.010046 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa097fcb-4ad5-4fda-a410-a64ad13495d6-logs\") pod \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.010272 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z4qt\" (UniqueName: \"kubernetes.io/projected/aa097fcb-4ad5-4fda-a410-a64ad13495d6-kube-api-access-4z4qt\") pod \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.010324 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-scripts\") pod \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.010400 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa097fcb-4ad5-4fda-a410-a64ad13495d6-horizon-secret-key\") pod \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.010428 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-config-data\") pod \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\" (UID: \"aa097fcb-4ad5-4fda-a410-a64ad13495d6\") " Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.011920 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/aa097fcb-4ad5-4fda-a410-a64ad13495d6-logs" (OuterVolumeSpecName: "logs") pod "aa097fcb-4ad5-4fda-a410-a64ad13495d6" (UID: "aa097fcb-4ad5-4fda-a410-a64ad13495d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.015739 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa097fcb-4ad5-4fda-a410-a64ad13495d6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "aa097fcb-4ad5-4fda-a410-a64ad13495d6" (UID: "aa097fcb-4ad5-4fda-a410-a64ad13495d6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.018525 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa097fcb-4ad5-4fda-a410-a64ad13495d6-kube-api-access-4z4qt" (OuterVolumeSpecName: "kube-api-access-4z4qt") pod "aa097fcb-4ad5-4fda-a410-a64ad13495d6" (UID: "aa097fcb-4ad5-4fda-a410-a64ad13495d6"). InnerVolumeSpecName "kube-api-access-4z4qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.037283 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-scripts" (OuterVolumeSpecName: "scripts") pod "aa097fcb-4ad5-4fda-a410-a64ad13495d6" (UID: "aa097fcb-4ad5-4fda-a410-a64ad13495d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.047937 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-config-data" (OuterVolumeSpecName: "config-data") pod "aa097fcb-4ad5-4fda-a410-a64ad13495d6" (UID: "aa097fcb-4ad5-4fda-a410-a64ad13495d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.112657 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.112695 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa097fcb-4ad5-4fda-a410-a64ad13495d6-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.112704 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z4qt\" (UniqueName: \"kubernetes.io/projected/aa097fcb-4ad5-4fda-a410-a64ad13495d6-kube-api-access-4z4qt\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.112714 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa097fcb-4ad5-4fda-a410-a64ad13495d6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.112723 4748 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa097fcb-4ad5-4fda-a410-a64ad13495d6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.162371 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk"] Mar 20 10:57:17 crc kubenswrapper[4748]: W0320 10:57:17.167756 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded96228e_6626_468c_bf60_a1073dfc123e.slice/crio-3afd7d7b5794c090641c58de7ab45cdf56c70cc1f231dc769b76d67d5fbd78ef WatchSource:0}: Error finding container 3afd7d7b5794c090641c58de7ab45cdf56c70cc1f231dc769b76d67d5fbd78ef: Status 404 returned error 
can't find the container with id 3afd7d7b5794c090641c58de7ab45cdf56c70cc1f231dc769b76d67d5fbd78ef Mar 20 10:57:17 crc kubenswrapper[4748]: W0320 10:57:17.186129 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd0027c7_fda5_4ece_a96d_a45762ad902a.slice/crio-269f3d60dfc0e943c705085abfad6be864bccbf4000eab2d5b5dd74f42d8c6af WatchSource:0}: Error finding container 269f3d60dfc0e943c705085abfad6be864bccbf4000eab2d5b5dd74f42d8c6af: Status 404 returned error can't find the container with id 269f3d60dfc0e943c705085abfad6be864bccbf4000eab2d5b5dd74f42d8c6af Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.186389 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-nt6hc"] Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.317868 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" event={"ID":"ed96228e-6626-468c-bf60-a1073dfc123e","Type":"ContainerStarted","Data":"3afd7d7b5794c090641c58de7ab45cdf56c70cc1f231dc769b76d67d5fbd78ef"} Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.319566 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" event={"ID":"cd0027c7-fda5-4ece-a96d-a45762ad902a","Type":"ContainerStarted","Data":"269f3d60dfc0e943c705085abfad6be864bccbf4000eab2d5b5dd74f42d8c6af"} Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.322505 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-556bf684bc-f9q9w" event={"ID":"6c942b79-bc14-4a48-8fbd-32667bc1afc6","Type":"ContainerStarted","Data":"5394a76d21c628e2d7683fe6e329976005afbb987e2d71c8e6f7a428f8054923"} Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.333787 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cfd8f8d7c-h5mnf" 
event={"ID":"aa097fcb-4ad5-4fda-a410-a64ad13495d6","Type":"ContainerDied","Data":"26907ff68937d83f28062eee7b1abfc36dea79a78efe0533d8be2eaa5e12cdf2"} Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.333873 4748 scope.go:117] "RemoveContainer" containerID="4e733315e69c8dbe83f67d93931f3e3f905d8ab4605e8d87bf5a3078d1e2d846" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.334052 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cfd8f8d7c-h5mnf" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.337917 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d4c76548d-zpdfr"] Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.341880 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9ddc460-94b3-44d2-9eed-2b4e344f8232","Type":"ContainerStarted","Data":"1dbd4fa19e69235d080432aa0b3ac54cce1f41f90a1059f97b72648d84c0fd39"} Mar 20 10:57:17 crc kubenswrapper[4748]: W0320 10:57:17.354191 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb0cf1bb_fa5f_4f46_951a_d3bda77fd294.slice/crio-ec1f065a1d6f350dbdf5ad5a083bebbe57b426c1100162ced22925c50149c171 WatchSource:0}: Error finding container ec1f065a1d6f350dbdf5ad5a083bebbe57b426c1100162ced22925c50149c171: Status 404 returned error can't find the container with id ec1f065a1d6f350dbdf5ad5a083bebbe57b426c1100162ced22925c50149c171 Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.386552 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cfd8f8d7c-h5mnf"] Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.408398 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cfd8f8d7c-h5mnf"] Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.545479 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" path="/var/lib/kubelet/pods/aa097fcb-4ad5-4fda-a410-a64ad13495d6/volumes" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.695962 4748 scope.go:117] "RemoveContainer" containerID="68208b69a0c9e6eadf462e65d3e43cfd1e8f6f00bf91a2972db8339946ee5370" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.786198 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.828299 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mj62\" (UniqueName: \"kubernetes.io/projected/e7f8f96f-de61-435d-a542-0b10d8860ccd-kube-api-access-7mj62\") pod \"e7f8f96f-de61-435d-a542-0b10d8860ccd\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.828798 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-config\") pod \"e7f8f96f-de61-435d-a542-0b10d8860ccd\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.829035 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-combined-ca-bundle\") pod \"e7f8f96f-de61-435d-a542-0b10d8860ccd\" (UID: \"e7f8f96f-de61-435d-a542-0b10d8860ccd\") " Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.846094 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f8f96f-de61-435d-a542-0b10d8860ccd-kube-api-access-7mj62" (OuterVolumeSpecName: "kube-api-access-7mj62") pod "e7f8f96f-de61-435d-a542-0b10d8860ccd" (UID: "e7f8f96f-de61-435d-a542-0b10d8860ccd"). InnerVolumeSpecName "kube-api-access-7mj62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.865691 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7f8f96f-de61-435d-a542-0b10d8860ccd" (UID: "e7f8f96f-de61-435d-a542-0b10d8860ccd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.872810 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-config" (OuterVolumeSpecName: "config") pod "e7f8f96f-de61-435d-a542-0b10d8860ccd" (UID: "e7f8f96f-de61-435d-a542-0b10d8860ccd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.933695 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.934335 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mj62\" (UniqueName: \"kubernetes.io/projected/e7f8f96f-de61-435d-a542-0b10d8860ccd-kube-api-access-7mj62\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:17 crc kubenswrapper[4748]: I0320 10:57:17.934361 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7f8f96f-de61-435d-a542-0b10d8860ccd-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.371045 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4c76548d-zpdfr" 
event={"ID":"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294","Type":"ContainerStarted","Data":"0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982"} Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.371424 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4c76548d-zpdfr" event={"ID":"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294","Type":"ContainerStarted","Data":"b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef"} Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.371442 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4c76548d-zpdfr" event={"ID":"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294","Type":"ContainerStarted","Data":"ec1f065a1d6f350dbdf5ad5a083bebbe57b426c1100162ced22925c50149c171"} Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.371997 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.372057 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.376405 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9ddc460-94b3-44d2-9eed-2b4e344f8232","Type":"ContainerStarted","Data":"7e75f88044945d065adcd67d239ad490c07771d34ce5bef16fa3e8c0b57c522a"} Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.378376 4748 generic.go:334] "Generic (PLEG): container finished" podID="cd0027c7-fda5-4ece-a96d-a45762ad902a" containerID="66fa7651be64027ace9f7143628b017a1dd50c839db23f62930476af8cd84dd3" exitCode=0 Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.378423 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" 
event={"ID":"cd0027c7-fda5-4ece-a96d-a45762ad902a","Type":"ContainerDied","Data":"66fa7651be64027ace9f7143628b017a1dd50c839db23f62930476af8cd84dd3"} Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.383469 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-r6t2v" event={"ID":"e7f8f96f-de61-435d-a542-0b10d8860ccd","Type":"ContainerDied","Data":"4f8fae9e624f573d3977105bbfe89b6f89ea2cd1946529b914861b11604a7d51"} Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.383519 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f8fae9e624f573d3977105bbfe89b6f89ea2cd1946529b914861b11604a7d51" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.383584 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-r6t2v" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.409646 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d4c76548d-zpdfr" podStartSLOduration=3.409612538 podStartE2EDuration="3.409612538s" podCreationTimestamp="2026-03-20 10:57:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:18.397091534 +0000 UTC m=+1273.538637368" watchObservedRunningTime="2026-03-20 10:57:18.409612538 +0000 UTC m=+1273.551158352" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.606172 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-nt6hc"] Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.663902 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sxxjl"] Mar 20 10:57:18 crc kubenswrapper[4748]: E0320 10:57:18.664369 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f8f96f-de61-435d-a542-0b10d8860ccd" containerName="neutron-db-sync" Mar 20 10:57:18 crc kubenswrapper[4748]: 
I0320 10:57:18.664397 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f8f96f-de61-435d-a542-0b10d8860ccd" containerName="neutron-db-sync" Mar 20 10:57:18 crc kubenswrapper[4748]: E0320 10:57:18.664419 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" containerName="horizon" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.664428 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" containerName="horizon" Mar 20 10:57:18 crc kubenswrapper[4748]: E0320 10:57:18.664444 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" containerName="horizon-log" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.664452 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" containerName="horizon-log" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.664682 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" containerName="horizon" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.664709 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa097fcb-4ad5-4fda-a410-a64ad13495d6" containerName="horizon-log" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.664728 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f8f96f-de61-435d-a542-0b10d8860ccd" containerName="neutron-db-sync" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.675828 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.701286 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sxxjl"] Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.734675 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-589cf645cb-wkg45"] Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.736134 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.743537 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.743826 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.744123 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6n2f8" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.744300 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.793374 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-589cf645cb-wkg45"] Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.869525 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.869603 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2689h\" (UniqueName: \"kubernetes.io/projected/86562079-7a54-4b0f-88bd-83634a175a8a-kube-api-access-2689h\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.869690 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-combined-ca-bundle\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.869738 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-ovndb-tls-certs\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.869798 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.869919 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.870830 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-config\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.870884 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htpsf\" (UniqueName: \"kubernetes.io/projected/c8728f34-b5b9-4ced-9424-b83ff940580f-kube-api-access-htpsf\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.870984 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-svc\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.871035 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-config\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.871057 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-httpd-config\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.982773 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-htpsf\" (UniqueName: \"kubernetes.io/projected/c8728f34-b5b9-4ced-9424-b83ff940580f-kube-api-access-htpsf\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.982873 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-svc\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.982906 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-config\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.982932 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-httpd-config\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.982977 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.983006 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2689h\" 
(UniqueName: \"kubernetes.io/projected/86562079-7a54-4b0f-88bd-83634a175a8a-kube-api-access-2689h\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.983046 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-combined-ca-bundle\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.983080 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-ovndb-tls-certs\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.983116 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.983178 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.983203 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-config\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.985721 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-config\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.988028 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-svc\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.988299 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.988527 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-config\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.988735 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: 
\"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.988926 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.991888 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-combined-ca-bundle\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.992005 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-ovndb-tls-certs\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:18 crc kubenswrapper[4748]: I0320 10:57:18.992060 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-httpd-config\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.003565 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2689h\" (UniqueName: \"kubernetes.io/projected/86562079-7a54-4b0f-88bd-83634a175a8a-kube-api-access-2689h\") pod \"dnsmasq-dns-85ff748b95-sxxjl\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:19 
crc kubenswrapper[4748]: I0320 10:57:19.004887 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htpsf\" (UniqueName: \"kubernetes.io/projected/c8728f34-b5b9-4ced-9424-b83ff940580f-kube-api-access-htpsf\") pod \"neutron-589cf645cb-wkg45\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.104624 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.115321 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.404331 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9ddc460-94b3-44d2-9eed-2b4e344f8232","Type":"ContainerStarted","Data":"353e45ad1106596beef54eccff0883480d7ef6880bf17ea3857025f6eeabaef0"} Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.408163 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" event={"ID":"cd0027c7-fda5-4ece-a96d-a45762ad902a","Type":"ContainerStarted","Data":"495cd93e6040a80b4c2b6f07f1a8b3dedbcf2a7062c8e29da71c1bc92d3d8751"} Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.408392 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" podUID="cd0027c7-fda5-4ece-a96d-a45762ad902a" containerName="dnsmasq-dns" containerID="cri-o://495cd93e6040a80b4c2b6f07f1a8b3dedbcf2a7062c8e29da71c1bc92d3d8751" gracePeriod=10 Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.408687 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.420960 4748 generic.go:334] "Generic (PLEG): container 
finished" podID="af16052b-a5ab-4244-b007-69a32d050a35" containerID="b38554dd1be02a45128fe9b6df10200ec1e855d0b4273f34b69a7887f02090c1" exitCode=0 Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.421821 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rm5bp" event={"ID":"af16052b-a5ab-4244-b007-69a32d050a35","Type":"ContainerDied","Data":"b38554dd1be02a45128fe9b6df10200ec1e855d0b4273f34b69a7887f02090c1"} Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.427587 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-77b5d7f4f8-8jmkc"] Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.430140 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.432090 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.434632 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.441008 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" podStartSLOduration=4.44098717 podStartE2EDuration="4.44098717s" podCreationTimestamp="2026-03-20 10:57:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:19.435447381 +0000 UTC m=+1274.576993195" watchObservedRunningTime="2026-03-20 10:57:19.44098717 +0000 UTC m=+1274.582532974" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.460379 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77b5d7f4f8-8jmkc"] Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.598315 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-combined-ca-bundle\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.600991 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-config-data-custom\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.601226 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-public-tls-certs\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.601263 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-logs\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.601333 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-internal-tls-certs\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 
10:57:19.601482 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k74k5\" (UniqueName: \"kubernetes.io/projected/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-kube-api-access-k74k5\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.601779 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-config-data\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.704965 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-config-data\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.706023 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-combined-ca-bundle\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.706849 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-config-data-custom\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 
10:57:19.707094 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-public-tls-certs\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.707221 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-logs\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.707368 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-internal-tls-certs\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.707528 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k74k5\" (UniqueName: \"kubernetes.io/projected/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-kube-api-access-k74k5\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.707585 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-logs\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.710960 4748 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-config-data\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.711566 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-combined-ca-bundle\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.725499 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k74k5\" (UniqueName: \"kubernetes.io/projected/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-kube-api-access-k74k5\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.725508 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-config-data-custom\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.729257 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-internal-tls-certs\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.742424 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/201e8a26-7bfa-40c7-aa3d-bf32c1344d61-public-tls-certs\") pod \"barbican-api-77b5d7f4f8-8jmkc\" (UID: \"201e8a26-7bfa-40c7-aa3d-bf32c1344d61\") " pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:19 crc kubenswrapper[4748]: I0320 10:57:19.852690 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.475698 4748 generic.go:334] "Generic (PLEG): container finished" podID="cd0027c7-fda5-4ece-a96d-a45762ad902a" containerID="495cd93e6040a80b4c2b6f07f1a8b3dedbcf2a7062c8e29da71c1bc92d3d8751" exitCode=0 Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.476568 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" event={"ID":"cd0027c7-fda5-4ece-a96d-a45762ad902a","Type":"ContainerDied","Data":"495cd93e6040a80b4c2b6f07f1a8b3dedbcf2a7062c8e29da71c1bc92d3d8751"} Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.476607 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" event={"ID":"cd0027c7-fda5-4ece-a96d-a45762ad902a","Type":"ContainerDied","Data":"269f3d60dfc0e943c705085abfad6be864bccbf4000eab2d5b5dd74f42d8c6af"} Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.476622 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="269f3d60dfc0e943c705085abfad6be864bccbf4000eab2d5b5dd74f42d8c6af" Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.602862 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.730867 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-sb\") pod \"cd0027c7-fda5-4ece-a96d-a45762ad902a\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.731027 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-config\") pod \"cd0027c7-fda5-4ece-a96d-a45762ad902a\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.731057 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-swift-storage-0\") pod \"cd0027c7-fda5-4ece-a96d-a45762ad902a\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.731510 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-svc\") pod \"cd0027c7-fda5-4ece-a96d-a45762ad902a\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.731868 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bzgg\" (UniqueName: \"kubernetes.io/projected/cd0027c7-fda5-4ece-a96d-a45762ad902a-kube-api-access-5bzgg\") pod \"cd0027c7-fda5-4ece-a96d-a45762ad902a\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.731959 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-nb\") pod \"cd0027c7-fda5-4ece-a96d-a45762ad902a\" (UID: \"cd0027c7-fda5-4ece-a96d-a45762ad902a\") " Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.737044 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd0027c7-fda5-4ece-a96d-a45762ad902a-kube-api-access-5bzgg" (OuterVolumeSpecName: "kube-api-access-5bzgg") pod "cd0027c7-fda5-4ece-a96d-a45762ad902a" (UID: "cd0027c7-fda5-4ece-a96d-a45762ad902a"). InnerVolumeSpecName "kube-api-access-5bzgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.840911 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bzgg\" (UniqueName: \"kubernetes.io/projected/cd0027c7-fda5-4ece-a96d-a45762ad902a-kube-api-access-5bzgg\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.932826 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-config" (OuterVolumeSpecName: "config") pod "cd0027c7-fda5-4ece-a96d-a45762ad902a" (UID: "cd0027c7-fda5-4ece-a96d-a45762ad902a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.933549 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sxxjl"] Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.943523 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.953339 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rm5bp" Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.955042 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cd0027c7-fda5-4ece-a96d-a45762ad902a" (UID: "cd0027c7-fda5-4ece-a96d-a45762ad902a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.968040 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cd0027c7-fda5-4ece-a96d-a45762ad902a" (UID: "cd0027c7-fda5-4ece-a96d-a45762ad902a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:20 crc kubenswrapper[4748]: I0320 10:57:20.989491 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77b5d7f4f8-8jmkc"] Mar 20 10:57:21 crc kubenswrapper[4748]: W0320 10:57:21.013574 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod201e8a26_7bfa_40c7_aa3d_bf32c1344d61.slice/crio-b80275da95b29920993dfde339a268a717fedc4da66c8d0ee46000a1230ec691 WatchSource:0}: Error finding container b80275da95b29920993dfde339a268a717fedc4da66c8d0ee46000a1230ec691: Status 404 returned error can't find the container with id b80275da95b29920993dfde339a268a717fedc4da66c8d0ee46000a1230ec691 Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.024756 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cd0027c7-fda5-4ece-a96d-a45762ad902a" (UID: "cd0027c7-fda5-4ece-a96d-a45762ad902a"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.042361 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cd0027c7-fda5-4ece-a96d-a45762ad902a" (UID: "cd0027c7-fda5-4ece-a96d-a45762ad902a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.044465 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-scripts\") pod \"af16052b-a5ab-4244-b007-69a32d050a35\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.048560 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-config-data\") pod \"af16052b-a5ab-4244-b007-69a32d050a35\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.049043 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng9kt\" (UniqueName: \"kubernetes.io/projected/af16052b-a5ab-4244-b007-69a32d050a35-kube-api-access-ng9kt\") pod \"af16052b-a5ab-4244-b007-69a32d050a35\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.049615 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-combined-ca-bundle\") pod \"af16052b-a5ab-4244-b007-69a32d050a35\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " Mar 20 10:57:21 crc 
kubenswrapper[4748]: I0320 10:57:21.049889 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-db-sync-config-data\") pod \"af16052b-a5ab-4244-b007-69a32d050a35\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.050060 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af16052b-a5ab-4244-b007-69a32d050a35-etc-machine-id\") pod \"af16052b-a5ab-4244-b007-69a32d050a35\" (UID: \"af16052b-a5ab-4244-b007-69a32d050a35\") " Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.051345 4748 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.051523 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.051636 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.051735 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd0027c7-fda5-4ece-a96d-a45762ad902a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.051627 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af16052b-a5ab-4244-b007-69a32d050a35-etc-machine-id" 
(OuterVolumeSpecName: "etc-machine-id") pod "af16052b-a5ab-4244-b007-69a32d050a35" (UID: "af16052b-a5ab-4244-b007-69a32d050a35"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.051782 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-scripts" (OuterVolumeSpecName: "scripts") pod "af16052b-a5ab-4244-b007-69a32d050a35" (UID: "af16052b-a5ab-4244-b007-69a32d050a35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.061862 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "af16052b-a5ab-4244-b007-69a32d050a35" (UID: "af16052b-a5ab-4244-b007-69a32d050a35"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.065016 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af16052b-a5ab-4244-b007-69a32d050a35-kube-api-access-ng9kt" (OuterVolumeSpecName: "kube-api-access-ng9kt") pod "af16052b-a5ab-4244-b007-69a32d050a35" (UID: "af16052b-a5ab-4244-b007-69a32d050a35"). InnerVolumeSpecName "kube-api-access-ng9kt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.069519 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-589cf645cb-wkg45"] Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.096262 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af16052b-a5ab-4244-b007-69a32d050a35" (UID: "af16052b-a5ab-4244-b007-69a32d050a35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4748]: W0320 10:57:21.106068 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8728f34_b5b9_4ced_9424_b83ff940580f.slice/crio-50ff638ef6da43b65c0a92fd98e016e06e128fed98784dc44857c6bfbf24f48c WatchSource:0}: Error finding container 50ff638ef6da43b65c0a92fd98e016e06e128fed98784dc44857c6bfbf24f48c: Status 404 returned error can't find the container with id 50ff638ef6da43b65c0a92fd98e016e06e128fed98784dc44857c6bfbf24f48c Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.118639 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-config-data" (OuterVolumeSpecName: "config-data") pod "af16052b-a5ab-4244-b007-69a32d050a35" (UID: "af16052b-a5ab-4244-b007-69a32d050a35"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.154453 4748 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.154698 4748 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af16052b-a5ab-4244-b007-69a32d050a35-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.154806 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.154896 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.154980 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng9kt\" (UniqueName: \"kubernetes.io/projected/af16052b-a5ab-4244-b007-69a32d050a35-kube-api-access-ng9kt\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.155055 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af16052b-a5ab-4244-b007-69a32d050a35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.507998 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9ddc460-94b3-44d2-9eed-2b4e344f8232","Type":"ContainerStarted","Data":"04847ca44cd324189231aad70f02b74561c73fd6831bcdaaf48927e9710ec62a"} Mar 20 10:57:21 crc 
kubenswrapper[4748]: I0320 10:57:21.510160 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589cf645cb-wkg45" event={"ID":"c8728f34-b5b9-4ced-9424-b83ff940580f","Type":"ContainerStarted","Data":"a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.510217 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589cf645cb-wkg45" event={"ID":"c8728f34-b5b9-4ced-9424-b83ff940580f","Type":"ContainerStarted","Data":"50ff638ef6da43b65c0a92fd98e016e06e128fed98784dc44857c6bfbf24f48c"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.540583 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rm5bp" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.546465 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-556bf684bc-f9q9w" event={"ID":"6c942b79-bc14-4a48-8fbd-32667bc1afc6","Type":"ContainerStarted","Data":"5563daf629262ee0b05ca6e6909f872b18d8a15c842405ed58b1bf65587ae670"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.546826 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-556bf684bc-f9q9w" event={"ID":"6c942b79-bc14-4a48-8fbd-32667bc1afc6","Type":"ContainerStarted","Data":"0ade5d3344a5da006d46128ce07ce4eeef84429474a9ef916fad30069ec6d23e"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.558913 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rm5bp" event={"ID":"af16052b-a5ab-4244-b007-69a32d050a35","Type":"ContainerDied","Data":"60e807de3887c03a9534eba93d8d22e8db9862960b97894d8ad6ed0aa49676a3"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.559289 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60e807de3887c03a9534eba93d8d22e8db9862960b97894d8ad6ed0aa49676a3" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 
10:57:21.557062 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-556bf684bc-f9q9w" podStartSLOduration=3.382119578 podStartE2EDuration="6.557030243s" podCreationTimestamp="2026-03-20 10:57:15 +0000 UTC" firstStartedPulling="2026-03-20 10:57:16.953410934 +0000 UTC m=+1272.094956758" lastFinishedPulling="2026-03-20 10:57:20.128321609 +0000 UTC m=+1275.269867423" observedRunningTime="2026-03-20 10:57:21.546231662 +0000 UTC m=+1276.687777486" watchObservedRunningTime="2026-03-20 10:57:21.557030243 +0000 UTC m=+1276.698576047" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.560565 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" event={"ID":"201e8a26-7bfa-40c7-aa3d-bf32c1344d61","Type":"ContainerStarted","Data":"9c3cbe26835d3a6462fc1372f6c93fbdc3561d8fb391ea4b973cda6e7b8219af"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.560596 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" event={"ID":"201e8a26-7bfa-40c7-aa3d-bf32c1344d61","Type":"ContainerStarted","Data":"b80275da95b29920993dfde339a268a717fedc4da66c8d0ee46000a1230ec691"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.566516 4748 generic.go:334] "Generic (PLEG): container finished" podID="86562079-7a54-4b0f-88bd-83634a175a8a" containerID="4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d" exitCode=0 Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.566562 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" event={"ID":"86562079-7a54-4b0f-88bd-83634a175a8a","Type":"ContainerDied","Data":"4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.566579 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" 
event={"ID":"86562079-7a54-4b0f-88bd-83634a175a8a","Type":"ContainerStarted","Data":"1ff95c76a339389aa43a000d6b1a91f0e4bcad3177e33249b830b5ba2804dcbb"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.597146 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-nt6hc" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.597274 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" event={"ID":"ed96228e-6626-468c-bf60-a1073dfc123e","Type":"ContainerStarted","Data":"61b7cf17504dec4e5776281865de3725898071cf0ed4fbc4b64b0d4e659dbd5e"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.597330 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" event={"ID":"ed96228e-6626-468c-bf60-a1073dfc123e","Type":"ContainerStarted","Data":"bb2d542af4c27cb9dedc88299a4a7848832790aebd994f76b7d5f6461096aae9"} Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.646636 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b69f7d5cb-p5jsk" podStartSLOduration=3.706443957 podStartE2EDuration="6.646617401s" podCreationTimestamp="2026-03-20 10:57:15 +0000 UTC" firstStartedPulling="2026-03-20 10:57:17.169690652 +0000 UTC m=+1272.311236466" lastFinishedPulling="2026-03-20 10:57:20.109864096 +0000 UTC m=+1275.251409910" observedRunningTime="2026-03-20 10:57:21.643982775 +0000 UTC m=+1276.785528599" watchObservedRunningTime="2026-03-20 10:57:21.646617401 +0000 UTC m=+1276.788163215" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.718866 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-nt6hc"] Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.787966 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-nt6hc"] Mar 20 10:57:21 crc 
kubenswrapper[4748]: I0320 10:57:21.811063 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 10:57:21 crc kubenswrapper[4748]: E0320 10:57:21.811734 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af16052b-a5ab-4244-b007-69a32d050a35" containerName="cinder-db-sync" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.811751 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="af16052b-a5ab-4244-b007-69a32d050a35" containerName="cinder-db-sync" Mar 20 10:57:21 crc kubenswrapper[4748]: E0320 10:57:21.811788 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd0027c7-fda5-4ece-a96d-a45762ad902a" containerName="dnsmasq-dns" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.811796 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0027c7-fda5-4ece-a96d-a45762ad902a" containerName="dnsmasq-dns" Mar 20 10:57:21 crc kubenswrapper[4748]: E0320 10:57:21.811821 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd0027c7-fda5-4ece-a96d-a45762ad902a" containerName="init" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.811828 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd0027c7-fda5-4ece-a96d-a45762ad902a" containerName="init" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.812146 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd0027c7-fda5-4ece-a96d-a45762ad902a" containerName="dnsmasq-dns" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.812167 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="af16052b-a5ab-4244-b007-69a32d050a35" containerName="cinder-db-sync" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.813471 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.834763 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.837672 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-q94rm" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.837803 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.837964 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.907121 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.907207 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05905a19-bbef-49a3-848c-d08b0862ba89-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.907237 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-scripts\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.907274 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.907302 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.907342 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86n7r\" (UniqueName: \"kubernetes.io/projected/05905a19-bbef-49a3-848c-d08b0862ba89-kube-api-access-86n7r\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:21 crc kubenswrapper[4748]: I0320 10:57:21.998660 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.009962 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.010062 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05905a19-bbef-49a3-848c-d08b0862ba89-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc 
kubenswrapper[4748]: I0320 10:57:22.010095 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-scripts\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.010141 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.010176 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.010228 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86n7r\" (UniqueName: \"kubernetes.io/projected/05905a19-bbef-49a3-848c-d08b0862ba89-kube-api-access-86n7r\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.011575 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05905a19-bbef-49a3-848c-d08b0862ba89-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.033094 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.033531 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.034375 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-scripts\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.058307 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86n7r\" (UniqueName: \"kubernetes.io/projected/05905a19-bbef-49a3-848c-d08b0862ba89-kube-api-access-86n7r\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.059489 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.175796 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.259324 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sxxjl"] Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.343656 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-dg9px"] Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.355230 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.367734 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-dg9px"] Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.407549 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.409210 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.432144 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.433213 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.433236 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-config\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.433307 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.433339 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.433360 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kkfn\" (UniqueName: \"kubernetes.io/projected/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-kube-api-access-8kkfn\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.433434 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.437742 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.535253 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86dd57e9-a5c8-47c3-ab5e-aab96e192486-etc-machine-id\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.536413 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t5dk\" (UniqueName: \"kubernetes.io/projected/86dd57e9-a5c8-47c3-ab5e-aab96e192486-kube-api-access-8t5dk\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.536511 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.536590 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.536668 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86dd57e9-a5c8-47c3-ab5e-aab96e192486-logs\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.536759 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data-custom\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.536849 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.536922 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-config\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.537035 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.537128 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.537202 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: 
\"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.537302 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kkfn\" (UniqueName: \"kubernetes.io/projected/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-kube-api-access-8kkfn\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.537409 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-scripts\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.538828 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.540895 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.541505 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 
10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.542145 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-config\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.552923 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.590608 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kkfn\" (UniqueName: \"kubernetes.io/projected/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-kube-api-access-8kkfn\") pod \"dnsmasq-dns-5c9776ccc5-dg9px\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.646199 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" event={"ID":"201e8a26-7bfa-40c7-aa3d-bf32c1344d61","Type":"ContainerStarted","Data":"6c0582f4442226b25df074e491c3076f173c176fc0160366009df639fa27f84c"} Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.647773 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.647801 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.659802 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.659901 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-scripts\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.659964 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86dd57e9-a5c8-47c3-ab5e-aab96e192486-etc-machine-id\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.660000 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t5dk\" (UniqueName: \"kubernetes.io/projected/86dd57e9-a5c8-47c3-ab5e-aab96e192486-kube-api-access-8t5dk\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.660029 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.660069 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86dd57e9-a5c8-47c3-ab5e-aab96e192486-logs\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 
10:57:22.660111 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data-custom\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.661404 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86dd57e9-a5c8-47c3-ab5e-aab96e192486-etc-machine-id\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.662396 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589cf645cb-wkg45" event={"ID":"c8728f34-b5b9-4ced-9424-b83ff940580f","Type":"ContainerStarted","Data":"eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45"} Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.668886 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86dd57e9-a5c8-47c3-ab5e-aab96e192486-logs\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.670655 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data-custom\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.678703 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " 
pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.696267 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-scripts\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.703537 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" podStartSLOduration=3.703508763 podStartE2EDuration="3.703508763s" podCreationTimestamp="2026-03-20 10:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:22.696690552 +0000 UTC m=+1277.838236376" watchObservedRunningTime="2026-03-20 10:57:22.703508763 +0000 UTC m=+1277.845054577" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.722807 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t5dk\" (UniqueName: \"kubernetes.io/projected/86dd57e9-a5c8-47c3-ab5e-aab96e192486-kube-api-access-8t5dk\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.724679 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data\") pod \"cinder-api-0\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " pod="openstack/cinder-api-0" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.779500 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:22 crc kubenswrapper[4748]: I0320 10:57:22.818352 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.055640 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.446975 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-dg9px"] Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.513409 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.556230 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd0027c7-fda5-4ece-a96d-a45762ad902a" path="/var/lib/kubelet/pods/cd0027c7-fda5-4ece-a96d-a45762ad902a/volumes" Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.699131 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"86dd57e9-a5c8-47c3-ab5e-aab96e192486","Type":"ContainerStarted","Data":"2309da3130efbd09105098d8b3e661aa053f1456d60c07851abe9460ae9304ce"} Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.707329 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05905a19-bbef-49a3-848c-d08b0862ba89","Type":"ContainerStarted","Data":"94acf0c195d7e536b3b9215e8f4b086cf8e9719e2a7b8b7a5965373a8d0aaae2"} Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.718878 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" event={"ID":"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba","Type":"ContainerStarted","Data":"093dd8b3922c27300d223e058e975fa705a15dff63764c4d7160dce70705a0f4"} Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.735879 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" 
event={"ID":"86562079-7a54-4b0f-88bd-83634a175a8a","Type":"ContainerStarted","Data":"c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90"} Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.735896 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" podUID="86562079-7a54-4b0f-88bd-83634a175a8a" containerName="dnsmasq-dns" containerID="cri-o://c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90" gracePeriod=10 Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.736512 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.821406 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" podStartSLOduration=5.821341046 podStartE2EDuration="5.821341046s" podCreationTimestamp="2026-03-20 10:57:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:23.766564901 +0000 UTC m=+1278.908110715" watchObservedRunningTime="2026-03-20 10:57:23.821341046 +0000 UTC m=+1278.962886860" Mar 20 10:57:23 crc kubenswrapper[4748]: I0320 10:57:23.824139 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-589cf645cb-wkg45" podStartSLOduration=5.824117975 podStartE2EDuration="5.824117975s" podCreationTimestamp="2026-03-20 10:57:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:23.800907093 +0000 UTC m=+1278.942452907" watchObservedRunningTime="2026-03-20 10:57:23.824117975 +0000 UTC m=+1278.965663789" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.443664 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.534637 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-svc\") pod \"86562079-7a54-4b0f-88bd-83634a175a8a\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.534766 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-config\") pod \"86562079-7a54-4b0f-88bd-83634a175a8a\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.534816 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-sb\") pod \"86562079-7a54-4b0f-88bd-83634a175a8a\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.534904 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-nb\") pod \"86562079-7a54-4b0f-88bd-83634a175a8a\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.534975 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-swift-storage-0\") pod \"86562079-7a54-4b0f-88bd-83634a175a8a\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.535101 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2689h\" 
(UniqueName: \"kubernetes.io/projected/86562079-7a54-4b0f-88bd-83634a175a8a-kube-api-access-2689h\") pod \"86562079-7a54-4b0f-88bd-83634a175a8a\" (UID: \"86562079-7a54-4b0f-88bd-83634a175a8a\") " Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.576015 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86562079-7a54-4b0f-88bd-83634a175a8a-kube-api-access-2689h" (OuterVolumeSpecName: "kube-api-access-2689h") pod "86562079-7a54-4b0f-88bd-83634a175a8a" (UID: "86562079-7a54-4b0f-88bd-83634a175a8a"). InnerVolumeSpecName "kube-api-access-2689h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.638450 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2689h\" (UniqueName: \"kubernetes.io/projected/86562079-7a54-4b0f-88bd-83634a175a8a-kube-api-access-2689h\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.647652 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "86562079-7a54-4b0f-88bd-83634a175a8a" (UID: "86562079-7a54-4b0f-88bd-83634a175a8a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.652503 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "86562079-7a54-4b0f-88bd-83634a175a8a" (UID: "86562079-7a54-4b0f-88bd-83634a175a8a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.656476 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86562079-7a54-4b0f-88bd-83634a175a8a" (UID: "86562079-7a54-4b0f-88bd-83634a175a8a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.667048 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "86562079-7a54-4b0f-88bd-83634a175a8a" (UID: "86562079-7a54-4b0f-88bd-83634a175a8a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.668416 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-config" (OuterVolumeSpecName: "config") pod "86562079-7a54-4b0f-88bd-83634a175a8a" (UID: "86562079-7a54-4b0f-88bd-83634a175a8a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.740170 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.740203 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.740217 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.740228 4748 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.740239 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86562079-7a54-4b0f-88bd-83634a175a8a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.753258 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9ddc460-94b3-44d2-9eed-2b4e344f8232","Type":"ContainerStarted","Data":"cfb24b150bed54645f70e625690898fb4f02902434052c487e5f5e92fe2cbabf"} Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.754663 4748 generic.go:334] "Generic (PLEG): container finished" podID="f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" containerID="e47ff1992ccdb4eeeb03582ed6109b9aa838fd7fcc9045eabe6921d6bf8e5b87" exitCode=0 Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.754758 4748 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" event={"ID":"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba","Type":"ContainerDied","Data":"e47ff1992ccdb4eeeb03582ed6109b9aa838fd7fcc9045eabe6921d6bf8e5b87"} Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.756467 4748 generic.go:334] "Generic (PLEG): container finished" podID="86562079-7a54-4b0f-88bd-83634a175a8a" containerID="c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90" exitCode=0 Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.756513 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" event={"ID":"86562079-7a54-4b0f-88bd-83634a175a8a","Type":"ContainerDied","Data":"c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90"} Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.756532 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" event={"ID":"86562079-7a54-4b0f-88bd-83634a175a8a","Type":"ContainerDied","Data":"1ff95c76a339389aa43a000d6b1a91f0e4bcad3177e33249b830b5ba2804dcbb"} Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.756550 4748 scope.go:117] "RemoveContainer" containerID="c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.756680 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-sxxjl" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.764015 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"86dd57e9-a5c8-47c3-ab5e-aab96e192486","Type":"ContainerStarted","Data":"19552a7b5b26ba8683b1ea504967d5a99e72d2d8591d187f60e9ff1be670b995"} Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.788825 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.036056253 podStartE2EDuration="9.788797684s" podCreationTimestamp="2026-03-20 10:57:15 +0000 UTC" firstStartedPulling="2026-03-20 10:57:16.632222494 +0000 UTC m=+1271.773768308" lastFinishedPulling="2026-03-20 10:57:23.384963925 +0000 UTC m=+1278.526509739" observedRunningTime="2026-03-20 10:57:24.782990489 +0000 UTC m=+1279.924536313" watchObservedRunningTime="2026-03-20 10:57:24.788797684 +0000 UTC m=+1279.930343498" Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.833216 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sxxjl"] Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.843356 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sxxjl"] Mar 20 10:57:24 crc kubenswrapper[4748]: I0320 10:57:24.864580 4748 scope.go:117] "RemoveContainer" containerID="4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d" Mar 20 10:57:25 crc kubenswrapper[4748]: I0320 10:57:25.002999 4748 scope.go:117] "RemoveContainer" containerID="c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90" Mar 20 10:57:25 crc kubenswrapper[4748]: E0320 10:57:25.006432 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90\": container with ID starting with 
c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90 not found: ID does not exist" containerID="c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90" Mar 20 10:57:25 crc kubenswrapper[4748]: I0320 10:57:25.006527 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90"} err="failed to get container status \"c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90\": rpc error: code = NotFound desc = could not find container \"c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90\": container with ID starting with c54de2d51aeed57d1bc3da2b2fdca64475dfba5f61fecc7601cd4a27e1b03d90 not found: ID does not exist" Mar 20 10:57:25 crc kubenswrapper[4748]: I0320 10:57:25.006563 4748 scope.go:117] "RemoveContainer" containerID="4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d" Mar 20 10:57:25 crc kubenswrapper[4748]: E0320 10:57:25.007256 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d\": container with ID starting with 4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d not found: ID does not exist" containerID="4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d" Mar 20 10:57:25 crc kubenswrapper[4748]: I0320 10:57:25.007338 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d"} err="failed to get container status \"4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d\": rpc error: code = NotFound desc = could not find container \"4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d\": container with ID starting with 4491ded1e30ac6d3c3535024d7ccde4384fd1c44cccd376d41a88849fec93d4d not found: ID does not 
exist" Mar 20 10:57:25 crc kubenswrapper[4748]: I0320 10:57:25.480011 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:57:25 crc kubenswrapper[4748]: I0320 10:57:25.553947 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86562079-7a54-4b0f-88bd-83634a175a8a" path="/var/lib/kubelet/pods/86562079-7a54-4b0f-88bd-83634a175a8a/volumes" Mar 20 10:57:25 crc kubenswrapper[4748]: I0320 10:57:25.778276 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" event={"ID":"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba","Type":"ContainerStarted","Data":"9865d2f53475bb341f646c1a24fffb584a17e8effa6599ec36b2ad72671239b1"} Mar 20 10:57:25 crc kubenswrapper[4748]: I0320 10:57:25.779598 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:25 crc kubenswrapper[4748]: I0320 10:57:25.783098 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.642190 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" podStartSLOduration=4.642151934 podStartE2EDuration="4.642151934s" podCreationTimestamp="2026-03-20 10:57:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.810453493 +0000 UTC m=+1280.951999307" watchObservedRunningTime="2026-03-20 10:57:26.642151934 +0000 UTC m=+1281.783697748" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.643967 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.796618 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"86dd57e9-a5c8-47c3-ab5e-aab96e192486","Type":"ContainerStarted","Data":"72464cd488d500d338050fb49e9cd488de7ce436a8d64a0c4b31508e851212e9"} Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.797982 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.800241 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05905a19-bbef-49a3-848c-d08b0862ba89","Type":"ContainerStarted","Data":"ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9"} Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.808536 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-695f6cc9c-5bkz4"] Mar 20 10:57:26 crc kubenswrapper[4748]: E0320 10:57:26.809258 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86562079-7a54-4b0f-88bd-83634a175a8a" containerName="dnsmasq-dns" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.809343 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="86562079-7a54-4b0f-88bd-83634a175a8a" containerName="dnsmasq-dns" Mar 20 10:57:26 crc kubenswrapper[4748]: E0320 10:57:26.809429 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86562079-7a54-4b0f-88bd-83634a175a8a" containerName="init" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.809514 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="86562079-7a54-4b0f-88bd-83634a175a8a" containerName="init" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.809800 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="86562079-7a54-4b0f-88bd-83634a175a8a" containerName="dnsmasq-dns" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.811215 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.817418 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.817737 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.826282 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-695f6cc9c-5bkz4"] Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.833251 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.833216929 podStartE2EDuration="4.833216929s" podCreationTimestamp="2026-03-20 10:57:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:26.827199078 +0000 UTC m=+1281.968744912" watchObservedRunningTime="2026-03-20 10:57:26.833216929 +0000 UTC m=+1281.974762743" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.932983 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-config\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.933226 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-combined-ca-bundle\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.935914 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-httpd-config\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.936028 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-ovndb-tls-certs\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.936172 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-internal-tls-certs\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.936254 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn7vw\" (UniqueName: \"kubernetes.io/projected/0900bb20-c211-44be-a5f8-6775641e54ca-kube-api-access-qn7vw\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:26 crc kubenswrapper[4748]: I0320 10:57:26.936410 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-public-tls-certs\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.037790 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-public-tls-certs\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.037938 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-config\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.037966 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-combined-ca-bundle\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.038074 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-httpd-config\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.038114 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-ovndb-tls-certs\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.038172 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-internal-tls-certs\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.038197 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn7vw\" (UniqueName: \"kubernetes.io/projected/0900bb20-c211-44be-a5f8-6775641e54ca-kube-api-access-qn7vw\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.049453 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-ovndb-tls-certs\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.050550 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-httpd-config\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.053260 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-public-tls-certs\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.053336 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-internal-tls-certs\") pod \"neutron-695f6cc9c-5bkz4\" (UID: 
\"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.067222 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-combined-ca-bundle\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.086949 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0900bb20-c211-44be-a5f8-6775641e54ca-config\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.092025 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn7vw\" (UniqueName: \"kubernetes.io/projected/0900bb20-c211-44be-a5f8-6775641e54ca-kube-api-access-qn7vw\") pod \"neutron-695f6cc9c-5bkz4\" (UID: \"0900bb20-c211-44be-a5f8-6775641e54ca\") " pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.134340 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.811527 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" containerName="cinder-api-log" containerID="cri-o://19552a7b5b26ba8683b1ea504967d5a99e72d2d8591d187f60e9ff1be670b995" gracePeriod=30 Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.813206 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05905a19-bbef-49a3-848c-d08b0862ba89","Type":"ContainerStarted","Data":"11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75"} Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.813530 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" containerName="cinder-api" containerID="cri-o://72464cd488d500d338050fb49e9cd488de7ce436a8d64a0c4b31508e851212e9" gracePeriod=30 Mar 20 10:57:27 crc kubenswrapper[4748]: I0320 10:57:27.845636 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.077988106 podStartE2EDuration="6.845612065s" podCreationTimestamp="2026-03-20 10:57:21 +0000 UTC" firstStartedPulling="2026-03-20 10:57:23.096958807 +0000 UTC m=+1278.238504621" lastFinishedPulling="2026-03-20 10:57:24.864582766 +0000 UTC m=+1280.006128580" observedRunningTime="2026-03-20 10:57:27.838468396 +0000 UTC m=+1282.980014220" watchObservedRunningTime="2026-03-20 10:57:27.845612065 +0000 UTC m=+1282.987157879" Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.150008 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-695f6cc9c-5bkz4"] Mar 20 10:57:28 crc kubenswrapper[4748]: W0320 10:57:28.158364 4748 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0900bb20_c211_44be_a5f8_6775641e54ca.slice/crio-033a7abb02be9e3efce7a02580f080abe011dde533a59045dc3eb31b0da13a57 WatchSource:0}: Error finding container 033a7abb02be9e3efce7a02580f080abe011dde533a59045dc3eb31b0da13a57: Status 404 returned error can't find the container with id 033a7abb02be9e3efce7a02580f080abe011dde533a59045dc3eb31b0da13a57 Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.568055 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7d79b6bb86-nhfts" Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.653076 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b85b9d5c6-7bgxl"] Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.654220 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b85b9d5c6-7bgxl" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon-log" containerID="cri-o://87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7" gracePeriod=30 Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.654685 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b85b9d5c6-7bgxl" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon" containerID="cri-o://1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f" gracePeriod=30 Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.850163 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695f6cc9c-5bkz4" event={"ID":"0900bb20-c211-44be-a5f8-6775641e54ca","Type":"ContainerStarted","Data":"68d9ef4c69ed8247f0593ad2612824b7e02ea2eed2236b8006cdfafd577632ff"} Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.850229 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695f6cc9c-5bkz4" 
event={"ID":"0900bb20-c211-44be-a5f8-6775641e54ca","Type":"ContainerStarted","Data":"033a7abb02be9e3efce7a02580f080abe011dde533a59045dc3eb31b0da13a57"} Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.863577 4748 generic.go:334] "Generic (PLEG): container finished" podID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" containerID="72464cd488d500d338050fb49e9cd488de7ce436a8d64a0c4b31508e851212e9" exitCode=0 Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.863620 4748 generic.go:334] "Generic (PLEG): container finished" podID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" containerID="19552a7b5b26ba8683b1ea504967d5a99e72d2d8591d187f60e9ff1be670b995" exitCode=143 Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.864981 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"86dd57e9-a5c8-47c3-ab5e-aab96e192486","Type":"ContainerDied","Data":"72464cd488d500d338050fb49e9cd488de7ce436a8d64a0c4b31508e851212e9"} Mar 20 10:57:28 crc kubenswrapper[4748]: I0320 10:57:28.865061 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"86dd57e9-a5c8-47c3-ab5e-aab96e192486","Type":"ContainerDied","Data":"19552a7b5b26ba8683b1ea504967d5a99e72d2d8591d187f60e9ff1be670b995"} Mar 20 10:57:29 crc kubenswrapper[4748]: I0320 10:57:29.612518 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:29 crc kubenswrapper[4748]: I0320 10:57:29.883782 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695f6cc9c-5bkz4" event={"ID":"0900bb20-c211-44be-a5f8-6775641e54ca","Type":"ContainerStarted","Data":"11970fd33da85e6b895fa69b2763b405ec6d860c06239d0122cfbc3913eb428c"} Mar 20 10:57:29 crc kubenswrapper[4748]: I0320 10:57:29.884593 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:29 crc kubenswrapper[4748]: I0320 10:57:29.897597 4748 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d4c76548d-zpdfr" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:29 crc kubenswrapper[4748]: I0320 10:57:29.924556 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-695f6cc9c-5bkz4" podStartSLOduration=3.924527566 podStartE2EDuration="3.924527566s" podCreationTimestamp="2026-03-20 10:57:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:29.916451473 +0000 UTC m=+1285.057997307" watchObservedRunningTime="2026-03-20 10:57:29.924527566 +0000 UTC m=+1285.066073380" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.500672 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.626301 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data-custom\") pod \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.626453 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-scripts\") pod \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.626511 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-combined-ca-bundle\") pod \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\" (UID: 
\"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.626571 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data\") pod \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.626606 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t5dk\" (UniqueName: \"kubernetes.io/projected/86dd57e9-a5c8-47c3-ab5e-aab96e192486-kube-api-access-8t5dk\") pod \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.626671 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86dd57e9-a5c8-47c3-ab5e-aab96e192486-logs\") pod \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.626790 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86dd57e9-a5c8-47c3-ab5e-aab96e192486-etc-machine-id\") pod \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\" (UID: \"86dd57e9-a5c8-47c3-ab5e-aab96e192486\") " Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.637201 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86dd57e9-a5c8-47c3-ab5e-aab96e192486-logs" (OuterVolumeSpecName: "logs") pod "86dd57e9-a5c8-47c3-ab5e-aab96e192486" (UID: "86dd57e9-a5c8-47c3-ab5e-aab96e192486"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.638405 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86dd57e9-a5c8-47c3-ab5e-aab96e192486-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "86dd57e9-a5c8-47c3-ab5e-aab96e192486" (UID: "86dd57e9-a5c8-47c3-ab5e-aab96e192486"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.647166 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dd57e9-a5c8-47c3-ab5e-aab96e192486-kube-api-access-8t5dk" (OuterVolumeSpecName: "kube-api-access-8t5dk") pod "86dd57e9-a5c8-47c3-ab5e-aab96e192486" (UID: "86dd57e9-a5c8-47c3-ab5e-aab96e192486"). InnerVolumeSpecName "kube-api-access-8t5dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.648720 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "86dd57e9-a5c8-47c3-ab5e-aab96e192486" (UID: "86dd57e9-a5c8-47c3-ab5e-aab96e192486"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.662007 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-scripts" (OuterVolumeSpecName: "scripts") pod "86dd57e9-a5c8-47c3-ab5e-aab96e192486" (UID: "86dd57e9-a5c8-47c3-ab5e-aab96e192486"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.713929 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86dd57e9-a5c8-47c3-ab5e-aab96e192486" (UID: "86dd57e9-a5c8-47c3-ab5e-aab96e192486"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.730146 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.730198 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t5dk\" (UniqueName: \"kubernetes.io/projected/86dd57e9-a5c8-47c3-ab5e-aab96e192486-kube-api-access-8t5dk\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.730216 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86dd57e9-a5c8-47c3-ab5e-aab96e192486-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.730229 4748 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86dd57e9-a5c8-47c3-ab5e-aab96e192486-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.730241 4748 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.730254 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.747863 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data" (OuterVolumeSpecName: "config-data") pod "86dd57e9-a5c8-47c3-ab5e-aab96e192486" (UID: "86dd57e9-a5c8-47c3-ab5e-aab96e192486"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.832556 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86dd57e9-a5c8-47c3-ab5e-aab96e192486-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.916925 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"86dd57e9-a5c8-47c3-ab5e-aab96e192486","Type":"ContainerDied","Data":"2309da3130efbd09105098d8b3e661aa053f1456d60c07851abe9460ae9304ce"} Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.917000 4748 scope.go:117] "RemoveContainer" containerID="72464cd488d500d338050fb49e9cd488de7ce436a8d64a0c4b31508e851212e9" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.917054 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.981630 4748 scope.go:117] "RemoveContainer" containerID="19552a7b5b26ba8683b1ea504967d5a99e72d2d8591d187f60e9ff1be670b995" Mar 20 10:57:30 crc kubenswrapper[4748]: I0320 10:57:30.998613 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.027974 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.062027 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 10:57:31 crc kubenswrapper[4748]: E0320 10:57:31.062623 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" containerName="cinder-api-log" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.062645 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" containerName="cinder-api-log" Mar 20 10:57:31 crc kubenswrapper[4748]: E0320 10:57:31.062683 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" containerName="cinder-api" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.062694 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" containerName="cinder-api" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.062992 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" containerName="cinder-api-log" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.063022 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" containerName="cinder-api" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.064359 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.069806 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.069968 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.070079 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.084546 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.139228 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-config-data\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.139272 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-config-data-custom\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.139299 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.139312 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.139361 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.139380 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-scripts\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.139397 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4376a841-9631-4c91-bae6-9c12b2f46a17-logs\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.139437 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4376a841-9631-4c91-bae6-9c12b2f46a17-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.139465 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm942\" (UniqueName: \"kubernetes.io/projected/4376a841-9631-4c91-bae6-9c12b2f46a17-kube-api-access-rm942\") pod \"cinder-api-0\" 
(UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.240721 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4376a841-9631-4c91-bae6-9c12b2f46a17-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.240774 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm942\" (UniqueName: \"kubernetes.io/projected/4376a841-9631-4c91-bae6-9c12b2f46a17-kube-api-access-rm942\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.240890 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-config-data\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.240913 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-config-data-custom\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.240903 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4376a841-9631-4c91-bae6-9c12b2f46a17-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.242077 4748 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.242113 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.242170 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.242193 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-scripts\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.242212 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4376a841-9631-4c91-bae6-9c12b2f46a17-logs\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.242523 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4376a841-9631-4c91-bae6-9c12b2f46a17-logs\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: 
I0320 10:57:31.246077 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.249371 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.251021 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-config-data\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.251135 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-scripts\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.251586 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-config-data-custom\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.252803 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4376a841-9631-4c91-bae6-9c12b2f46a17-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.269033 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm942\" (UniqueName: \"kubernetes.io/projected/4376a841-9631-4c91-bae6-9c12b2f46a17-kube-api-access-rm942\") pod \"cinder-api-0\" (UID: \"4376a841-9631-4c91-bae6-9c12b2f46a17\") " pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.413257 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.531699 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86dd57e9-a5c8-47c3-ab5e-aab96e192486" path="/var/lib/kubelet/pods/86dd57e9-a5c8-47c3-ab5e-aab96e192486/volumes" Mar 20 10:57:31 crc kubenswrapper[4748]: I0320 10:57:31.864058 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b85b9d5c6-7bgxl" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:60730->10.217.0.150:8443: read: connection reset by peer" Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.029225 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 10:57:32 crc kubenswrapper[4748]: W0320 10:57:32.036964 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4376a841_9631_4c91_bae6_9c12b2f46a17.slice/crio-1e5b0a70d887d4742767cab8889cd748176561ba770ed0c2d8ba327987e33aeb WatchSource:0}: Error finding container 1e5b0a70d887d4742767cab8889cd748176561ba770ed0c2d8ba327987e33aeb: Status 404 returned error can't find the container with id 1e5b0a70d887d4742767cab8889cd748176561ba770ed0c2d8ba327987e33aeb Mar 20 10:57:32 crc kubenswrapper[4748]: 
I0320 10:57:32.178351 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.426623 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.483952 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56cc45587d-dchtq" Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.504059 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.582019 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6495464d6d-wmp49"] Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.585440 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6495464d6d-wmp49" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-log" containerID="cri-o://aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a" gracePeriod=30 Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.585631 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6495464d6d-wmp49" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-api" containerID="cri-o://905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960" gracePeriod=30 Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.617238 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-6495464d6d-wmp49" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.156:8778/\": EOF" Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.618087 4748 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/placement-6495464d6d-wmp49" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.156:8778/\": EOF" Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.618337 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/placement-6495464d6d-wmp49" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.156:8778/\": EOF" Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.618531 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/placement-6495464d6d-wmp49" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.156:8778/\": EOF" Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.795094 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.903283 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5g98d"] Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.903603 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" podUID="24adc7ea-6cf7-416b-9570-a92736a9b48b" containerName="dnsmasq-dns" containerID="cri-o://82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136" gracePeriod=10 Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.998051 4748 generic.go:334] "Generic (PLEG): container finished" podID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerID="aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a" exitCode=143 Mar 20 10:57:32 crc kubenswrapper[4748]: I0320 10:57:32.998133 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6495464d6d-wmp49" 
event={"ID":"a9f0a0d1-a120-45fd-95f6-6a5650096207","Type":"ContainerDied","Data":"aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a"} Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.019033 4748 generic.go:334] "Generic (PLEG): container finished" podID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerID="1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f" exitCode=0 Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.019149 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b85b9d5c6-7bgxl" event={"ID":"18e8975c-a2d8-4319-b175-2a66ce3d97c9","Type":"ContainerDied","Data":"1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f"} Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.040895 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4376a841-9631-4c91-bae6-9c12b2f46a17","Type":"ContainerStarted","Data":"1e5b0a70d887d4742767cab8889cd748176561ba770ed0c2d8ba327987e33aeb"} Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.148565 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.317407 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" podUID="24adc7ea-6cf7-416b-9570-a92736a9b48b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.469314 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-688f5b7cfd-ffmqn" Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.888797 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.898466 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.900089 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.909969 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.910302 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-s4tgj" Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.910442 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.928394 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.954491 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-swift-storage-0\") pod \"24adc7ea-6cf7-416b-9570-a92736a9b48b\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.957356 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqdnr\" (UniqueName: \"kubernetes.io/projected/24adc7ea-6cf7-416b-9570-a92736a9b48b-kube-api-access-zqdnr\") pod \"24adc7ea-6cf7-416b-9570-a92736a9b48b\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.957584 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-sb\") pod \"24adc7ea-6cf7-416b-9570-a92736a9b48b\" (UID: 
\"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.957729 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-nb\") pod \"24adc7ea-6cf7-416b-9570-a92736a9b48b\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.957887 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-svc\") pod \"24adc7ea-6cf7-416b-9570-a92736a9b48b\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.957990 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-config\") pod \"24adc7ea-6cf7-416b-9570-a92736a9b48b\" (UID: \"24adc7ea-6cf7-416b-9570-a92736a9b48b\") " Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.958691 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.958864 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn7qf\" (UniqueName: \"kubernetes.io/projected/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-kube-api-access-jn7qf\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.959062 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:33 crc kubenswrapper[4748]: I0320 10:57:33.959222 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config-secret\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.006135 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24adc7ea-6cf7-416b-9570-a92736a9b48b-kube-api-access-zqdnr" (OuterVolumeSpecName: "kube-api-access-zqdnr") pod "24adc7ea-6cf7-416b-9570-a92736a9b48b" (UID: "24adc7ea-6cf7-416b-9570-a92736a9b48b"). InnerVolumeSpecName "kube-api-access-zqdnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.061860 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config-secret\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.063528 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.069082 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.076466 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn7qf\" (UniqueName: \"kubernetes.io/projected/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-kube-api-access-jn7qf\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.076756 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.077451 4748 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-zqdnr\" (UniqueName: \"kubernetes.io/projected/24adc7ea-6cf7-416b-9570-a92736a9b48b-kube-api-access-zqdnr\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.086874 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config-secret\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.113673 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.120621 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4376a841-9631-4c91-bae6-9c12b2f46a17","Type":"ContainerStarted","Data":"1e15c48ac41083fbf3332e2f38512ad56f933b52406626a0ba196b6c62f80124"} Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.122789 4748 generic.go:334] "Generic (PLEG): container finished" podID="24adc7ea-6cf7-416b-9570-a92736a9b48b" containerID="82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136" exitCode=0 Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.122971 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.122969 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" event={"ID":"24adc7ea-6cf7-416b-9570-a92736a9b48b","Type":"ContainerDied","Data":"82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136"} Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.123050 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5g98d" event={"ID":"24adc7ea-6cf7-416b-9570-a92736a9b48b","Type":"ContainerDied","Data":"1d39e39b445ed448b1df60fa8583d35fdcd62be2730f0ed34b82aa517a848ffc"} Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.123075 4748 scope.go:117] "RemoveContainer" containerID="82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.123121 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="05905a19-bbef-49a3-848c-d08b0862ba89" containerName="cinder-scheduler" containerID="cri-o://ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9" gracePeriod=30 Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.123484 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="05905a19-bbef-49a3-848c-d08b0862ba89" containerName="probe" containerID="cri-o://11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75" gracePeriod=30 Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.138859 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "24adc7ea-6cf7-416b-9570-a92736a9b48b" (UID: "24adc7ea-6cf7-416b-9570-a92736a9b48b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.154163 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24adc7ea-6cf7-416b-9570-a92736a9b48b" (UID: "24adc7ea-6cf7-416b-9570-a92736a9b48b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.173682 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn7qf\" (UniqueName: \"kubernetes.io/projected/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-kube-api-access-jn7qf\") pod \"openstackclient\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.180009 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.180050 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.189434 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "24adc7ea-6cf7-416b-9570-a92736a9b48b" (UID: "24adc7ea-6cf7-416b-9570-a92736a9b48b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.198671 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-config" (OuterVolumeSpecName: "config") pod "24adc7ea-6cf7-416b-9570-a92736a9b48b" (UID: "24adc7ea-6cf7-416b-9570-a92736a9b48b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.210955 4748 scope.go:117] "RemoveContainer" containerID="332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.230669 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "24adc7ea-6cf7-416b-9570-a92736a9b48b" (UID: "24adc7ea-6cf7-416b-9570-a92736a9b48b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.259526 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.282404 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.282445 4748 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.282464 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24adc7ea-6cf7-416b-9570-a92736a9b48b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.290513 4748 scope.go:117] "RemoveContainer" containerID="82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136" Mar 20 10:57:34 crc kubenswrapper[4748]: E0320 10:57:34.291251 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136\": container with ID starting with 82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136 not found: ID does not exist" containerID="82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.291296 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136"} err="failed to get container status \"82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136\": rpc error: code = NotFound desc = could not find container \"82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136\": container with ID starting with 
82ea62f0cf66489ad6299a0579431489e73d924d8588b0d43d6ec33c76da1136 not found: ID does not exist" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.291328 4748 scope.go:117] "RemoveContainer" containerID="332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce" Mar 20 10:57:34 crc kubenswrapper[4748]: E0320 10:57:34.291762 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce\": container with ID starting with 332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce not found: ID does not exist" containerID="332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.291794 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce"} err="failed to get container status \"332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce\": rpc error: code = NotFound desc = could not find container \"332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce\": container with ID starting with 332ee7a61599ba0028d64900ea9a7a116fecc5be4626068bb8e64b1e9931ddce not found: ID does not exist" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.307389 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.437577 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.450132 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.509168 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 10:57:34 crc 
kubenswrapper[4748]: E0320 10:57:34.509918 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24adc7ea-6cf7-416b-9570-a92736a9b48b" containerName="dnsmasq-dns" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.509939 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="24adc7ea-6cf7-416b-9570-a92736a9b48b" containerName="dnsmasq-dns" Mar 20 10:57:34 crc kubenswrapper[4748]: E0320 10:57:34.509954 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24adc7ea-6cf7-416b-9570-a92736a9b48b" containerName="init" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.509972 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="24adc7ea-6cf7-416b-9570-a92736a9b48b" containerName="init" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.510258 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="24adc7ea-6cf7-416b-9570-a92736a9b48b" containerName="dnsmasq-dns" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.511148 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.520271 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.532668 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5g98d"] Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.554081 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5g98d"] Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.595470 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.595671 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-openstack-config-secret\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.595809 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-openstack-config\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.595873 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v46t9\" (UniqueName: \"kubernetes.io/projected/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-kube-api-access-v46t9\") pod 
\"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.728496 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.728967 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-openstack-config-secret\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.729048 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-openstack-config\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.729071 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46t9\" (UniqueName: \"kubernetes.io/projected/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-kube-api-access-v46t9\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.732856 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-openstack-config\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 
10:57:34.740238 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.750661 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-openstack-config-secret\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.770141 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v46t9\" (UniqueName: \"kubernetes.io/projected/2b2f2b26-6292-47bb-b8ee-971d9b47c85d-kube-api-access-v46t9\") pod \"openstackclient\" (UID: \"2b2f2b26-6292-47bb-b8ee-971d9b47c85d\") " pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.857280 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" podUID="201e8a26-7bfa-40c7-aa3d-bf32c1344d61" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.863849 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" podUID="201e8a26-7bfa-40c7-aa3d-bf32c1344d61" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 10:57:34 crc kubenswrapper[4748]: I0320 10:57:34.901938 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 10:57:34 crc kubenswrapper[4748]: E0320 10:57:34.954256 4748 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 10:57:34 crc kubenswrapper[4748]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4_0(819fad1df2abbad30c4f71882596df5ee4ad3f7baa9dbef8b16a9fce594f2767): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"819fad1df2abbad30c4f71882596df5ee4ad3f7baa9dbef8b16a9fce594f2767" Netns:"/var/run/netns/8b521681-82f6-4c33-81fa-29a7e9c98182" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=819fad1df2abbad30c4f71882596df5ee4ad3f7baa9dbef8b16a9fce594f2767;K8S_POD_UID=1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 819fad1df2abbad30c4f71882596df5ee4ad3f7baa9dbef8b16a9fce594f2767 network default NAD default] [openstack/openstackclient 819fad1df2abbad30c4f71882596df5ee4ad3f7baa9dbef8b16a9fce594f2767 network default NAD default] pod deleted before sandbox ADD operation began Mar 20 10:57:34 crc kubenswrapper[4748]: ' Mar 20 10:57:34 crc kubenswrapper[4748]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 10:57:34 crc kubenswrapper[4748]: > Mar 20 10:57:34 crc kubenswrapper[4748]: E0320 10:57:34.954337 4748 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 10:57:34 crc kubenswrapper[4748]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4_0(819fad1df2abbad30c4f71882596df5ee4ad3f7baa9dbef8b16a9fce594f2767): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"819fad1df2abbad30c4f71882596df5ee4ad3f7baa9dbef8b16a9fce594f2767" Netns:"/var/run/netns/8b521681-82f6-4c33-81fa-29a7e9c98182" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=819fad1df2abbad30c4f71882596df5ee4ad3f7baa9dbef8b16a9fce594f2767;K8S_POD_UID=1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 819fad1df2abbad30c4f71882596df5ee4ad3f7baa9dbef8b16a9fce594f2767 network default NAD default] [openstack/openstackclient 819fad1df2abbad30c4f71882596df5ee4ad3f7baa9dbef8b16a9fce594f2767 network default NAD default] pod deleted before sandbox ADD operation began Mar 20 10:57:34 crc kubenswrapper[4748]: ' Mar 20 10:57:34 crc kubenswrapper[4748]: ': 
StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 10:57:34 crc kubenswrapper[4748]: > pod="openstack/openstackclient" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.144518 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4376a841-9631-4c91-bae6-9c12b2f46a17","Type":"ContainerStarted","Data":"8fec1ebcdeb9076e3f72f29d33f81bcde37aad1aa2eda2b9a758d4d3957e949d"} Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.144826 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.147350 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.177406 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.189496 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.18947333 podStartE2EDuration="5.18947333s" podCreationTimestamp="2026-03-20 10:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:35.186727051 +0000 UTC m=+1290.328272865" watchObservedRunningTime="2026-03-20 10:57:35.18947333 +0000 UTC m=+1290.331019144" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.194392 4748 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4" podUID="2b2f2b26-6292-47bb-b8ee-971d9b47c85d" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.256477 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-combined-ca-bundle\") pod \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.256580 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config-secret\") pod \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.256741 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn7qf\" (UniqueName: \"kubernetes.io/projected/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-kube-api-access-jn7qf\") pod \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " Mar 20 10:57:35 
crc kubenswrapper[4748]: I0320 10:57:35.256785 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config\") pod \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\" (UID: \"1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4\") " Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.262505 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4" (UID: "1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.268107 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4" (UID: "1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.268986 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4" (UID: "1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.270492 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-kube-api-access-jn7qf" (OuterVolumeSpecName: "kube-api-access-jn7qf") pod "1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4" (UID: "1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4"). InnerVolumeSpecName "kube-api-access-jn7qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.366127 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.366174 4748 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.366187 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn7qf\" (UniqueName: \"kubernetes.io/projected/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-kube-api-access-jn7qf\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.366202 4748 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.549080 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4" path="/var/lib/kubelet/pods/1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4/volumes" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.549473 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="24adc7ea-6cf7-416b-9570-a92736a9b48b" path="/var/lib/kubelet/pods/24adc7ea-6cf7-416b-9570-a92736a9b48b/volumes" Mar 20 10:57:35 crc kubenswrapper[4748]: I0320 10:57:35.713466 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 10:57:35 crc kubenswrapper[4748]: E0320 10:57:35.917984 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cecba7e_4b51_4f6e_ba7b_2e026ee04eb4.slice\": RecentStats: unable to find data in memory cache]" Mar 20 10:57:36 crc kubenswrapper[4748]: I0320 10:57:36.159576 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2b2f2b26-6292-47bb-b8ee-971d9b47c85d","Type":"ContainerStarted","Data":"668ba156af4cc98efbe754f1bc885351c813028f0650ae18159348aa4da5244a"} Mar 20 10:57:36 crc kubenswrapper[4748]: I0320 10:57:36.161879 4748 generic.go:334] "Generic (PLEG): container finished" podID="05905a19-bbef-49a3-848c-d08b0862ba89" containerID="11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75" exitCode=0 Mar 20 10:57:36 crc kubenswrapper[4748]: I0320 10:57:36.161974 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05905a19-bbef-49a3-848c-d08b0862ba89","Type":"ContainerDied","Data":"11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75"} Mar 20 10:57:36 crc kubenswrapper[4748]: I0320 10:57:36.161998 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 10:57:36 crc kubenswrapper[4748]: I0320 10:57:36.172087 4748 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1cecba7e-4b51-4f6e-ba7b-2e026ee04eb4" podUID="2b2f2b26-6292-47bb-b8ee-971d9b47c85d" Mar 20 10:57:36 crc kubenswrapper[4748]: I0320 10:57:36.592102 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/horizon-7d79b6bb86-nhfts" podUID="f3de236a-e527-4582-8eb5-03ca8aa883e0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.106673 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.214008 4748 generic.go:334] "Generic (PLEG): container finished" podID="05905a19-bbef-49a3-848c-d08b0862ba89" containerID="ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9" exitCode=0 Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.214060 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05905a19-bbef-49a3-848c-d08b0862ba89","Type":"ContainerDied","Data":"ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9"} Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.214090 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"05905a19-bbef-49a3-848c-d08b0862ba89","Type":"ContainerDied","Data":"94acf0c195d7e536b3b9215e8f4b086cf8e9719e2a7b8b7a5965373a8d0aaae2"} Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.214111 4748 scope.go:117] "RemoveContainer" containerID="11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75" Mar 20 10:57:37 crc 
kubenswrapper[4748]: I0320 10:57:37.214258 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.259538 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-scripts\") pod \"05905a19-bbef-49a3-848c-d08b0862ba89\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.259608 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data\") pod \"05905a19-bbef-49a3-848c-d08b0862ba89\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.259699 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05905a19-bbef-49a3-848c-d08b0862ba89-etc-machine-id\") pod \"05905a19-bbef-49a3-848c-d08b0862ba89\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.259757 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-combined-ca-bundle\") pod \"05905a19-bbef-49a3-848c-d08b0862ba89\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.259892 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86n7r\" (UniqueName: \"kubernetes.io/projected/05905a19-bbef-49a3-848c-d08b0862ba89-kube-api-access-86n7r\") pod \"05905a19-bbef-49a3-848c-d08b0862ba89\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.259973 4748 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data-custom\") pod \"05905a19-bbef-49a3-848c-d08b0862ba89\" (UID: \"05905a19-bbef-49a3-848c-d08b0862ba89\") " Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.260994 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05905a19-bbef-49a3-848c-d08b0862ba89-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "05905a19-bbef-49a3-848c-d08b0862ba89" (UID: "05905a19-bbef-49a3-848c-d08b0862ba89"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.280048 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05905a19-bbef-49a3-848c-d08b0862ba89-kube-api-access-86n7r" (OuterVolumeSpecName: "kube-api-access-86n7r") pod "05905a19-bbef-49a3-848c-d08b0862ba89" (UID: "05905a19-bbef-49a3-848c-d08b0862ba89"). InnerVolumeSpecName "kube-api-access-86n7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.294724 4748 scope.go:117] "RemoveContainer" containerID="ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.300034 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "05905a19-bbef-49a3-848c-d08b0862ba89" (UID: "05905a19-bbef-49a3-848c-d08b0862ba89"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.301207 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-scripts" (OuterVolumeSpecName: "scripts") pod "05905a19-bbef-49a3-848c-d08b0862ba89" (UID: "05905a19-bbef-49a3-848c-d08b0862ba89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.362757 4748 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.363039 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.363128 4748 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05905a19-bbef-49a3-848c-d08b0862ba89-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.363209 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86n7r\" (UniqueName: \"kubernetes.io/projected/05905a19-bbef-49a3-848c-d08b0862ba89-kube-api-access-86n7r\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.405722 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.426455 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"05905a19-bbef-49a3-848c-d08b0862ba89" (UID: "05905a19-bbef-49a3-848c-d08b0862ba89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.429783 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-6495464d6d-wmp49" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.156:8778/\": read tcp 10.217.0.2:46414->10.217.0.156:8778: read: connection reset by peer" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.430147 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-6495464d6d-wmp49" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.156:8778/\": read tcp 10.217.0.2:46418->10.217.0.156:8778: read: connection reset by peer" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.462678 4748 scope.go:117] "RemoveContainer" containerID="11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.464535 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:37 crc kubenswrapper[4748]: E0320 10:57:37.465916 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75\": container with ID starting with 11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75 not found: ID does not exist" containerID="11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.466115 4748 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75"} err="failed to get container status \"11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75\": rpc error: code = NotFound desc = could not find container \"11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75\": container with ID starting with 11c588d8d38829c4b32e022012f5e582ef31303fb102f997fdf28f2bc2c06c75 not found: ID does not exist" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.466396 4748 scope.go:117] "RemoveContainer" containerID="ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9" Mar 20 10:57:37 crc kubenswrapper[4748]: E0320 10:57:37.468357 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9\": container with ID starting with ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9 not found: ID does not exist" containerID="ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.468394 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9"} err="failed to get container status \"ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9\": rpc error: code = NotFound desc = could not find container \"ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9\": container with ID starting with ff21caed54a7631e0e405265b7fe2a4b27310b182a081f8dbedb4865c3e87fc9 not found: ID does not exist" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.473075 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data" (OuterVolumeSpecName: "config-data") pod "05905a19-bbef-49a3-848c-d08b0862ba89" (UID: 
"05905a19-bbef-49a3-848c-d08b0862ba89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.566474 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05905a19-bbef-49a3-848c-d08b0862ba89-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.671556 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.689528 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.704866 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 10:57:37 crc kubenswrapper[4748]: E0320 10:57:37.707569 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05905a19-bbef-49a3-848c-d08b0862ba89" containerName="cinder-scheduler" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.707596 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="05905a19-bbef-49a3-848c-d08b0862ba89" containerName="cinder-scheduler" Mar 20 10:57:37 crc kubenswrapper[4748]: E0320 10:57:37.707627 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05905a19-bbef-49a3-848c-d08b0862ba89" containerName="probe" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.707635 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="05905a19-bbef-49a3-848c-d08b0862ba89" containerName="probe" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.707927 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="05905a19-bbef-49a3-848c-d08b0862ba89" containerName="cinder-scheduler" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.707949 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="05905a19-bbef-49a3-848c-d08b0862ba89" 
containerName="probe" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.709140 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.721581 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.734635 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.884726 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-config-data\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.884942 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmpj\" (UniqueName: \"kubernetes.io/projected/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-kube-api-access-7gmpj\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.884996 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.885057 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.885082 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.885126 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.959913 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77b5d7f4f8-8jmkc" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.987458 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmpj\" (UniqueName: \"kubernetes.io/projected/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-kube-api-access-7gmpj\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.987526 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.987594 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-scripts\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.987647 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.987703 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.987762 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-config-data\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.987697 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.995290 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " 
pod="openstack/cinder-scheduler-0" Mar 20 10:57:37 crc kubenswrapper[4748]: I0320 10:57:37.996799 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.000381 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-config-data\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.002406 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-scripts\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.015226 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmpj\" (UniqueName: \"kubernetes.io/projected/9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c-kube-api-access-7gmpj\") pod \"cinder-scheduler-0\" (UID: \"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c\") " pod="openstack/cinder-scheduler-0" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.033826 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.034395 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d4c76548d-zpdfr"] Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.034651 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d4c76548d-zpdfr" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api-log" containerID="cri-o://b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef" gracePeriod=30 Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.035093 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d4c76548d-zpdfr" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api" containerID="cri-o://0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982" gracePeriod=30 Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.076440 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.194555 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f0a0d1-a120-45fd-95f6-6a5650096207-logs\") pod \"a9f0a0d1-a120-45fd-95f6-6a5650096207\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.194616 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-config-data\") pod \"a9f0a0d1-a120-45fd-95f6-6a5650096207\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.194695 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-combined-ca-bundle\") pod \"a9f0a0d1-a120-45fd-95f6-6a5650096207\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.194967 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-internal-tls-certs\") pod \"a9f0a0d1-a120-45fd-95f6-6a5650096207\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.195030 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-scripts\") pod \"a9f0a0d1-a120-45fd-95f6-6a5650096207\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.195086 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq5cc\" (UniqueName: 
\"kubernetes.io/projected/a9f0a0d1-a120-45fd-95f6-6a5650096207-kube-api-access-cq5cc\") pod \"a9f0a0d1-a120-45fd-95f6-6a5650096207\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.195114 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-public-tls-certs\") pod \"a9f0a0d1-a120-45fd-95f6-6a5650096207\" (UID: \"a9f0a0d1-a120-45fd-95f6-6a5650096207\") " Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.195211 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f0a0d1-a120-45fd-95f6-6a5650096207-logs" (OuterVolumeSpecName: "logs") pod "a9f0a0d1-a120-45fd-95f6-6a5650096207" (UID: "a9f0a0d1-a120-45fd-95f6-6a5650096207"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.195793 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9f0a0d1-a120-45fd-95f6-6a5650096207-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.208112 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-scripts" (OuterVolumeSpecName: "scripts") pod "a9f0a0d1-a120-45fd-95f6-6a5650096207" (UID: "a9f0a0d1-a120-45fd-95f6-6a5650096207"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.209795 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f0a0d1-a120-45fd-95f6-6a5650096207-kube-api-access-cq5cc" (OuterVolumeSpecName: "kube-api-access-cq5cc") pod "a9f0a0d1-a120-45fd-95f6-6a5650096207" (UID: "a9f0a0d1-a120-45fd-95f6-6a5650096207"). 
InnerVolumeSpecName "kube-api-access-cq5cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.266588 4748 generic.go:334] "Generic (PLEG): container finished" podID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerID="b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef" exitCode=143 Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.266743 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4c76548d-zpdfr" event={"ID":"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294","Type":"ContainerDied","Data":"b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef"} Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.284434 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-config-data" (OuterVolumeSpecName: "config-data") pod "a9f0a0d1-a120-45fd-95f6-6a5650096207" (UID: "a9f0a0d1-a120-45fd-95f6-6a5650096207"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.294773 4748 generic.go:334] "Generic (PLEG): container finished" podID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerID="905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960" exitCode=0 Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.295265 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6495464d6d-wmp49" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.295306 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6495464d6d-wmp49" event={"ID":"a9f0a0d1-a120-45fd-95f6-6a5650096207","Type":"ContainerDied","Data":"905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960"} Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.295395 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6495464d6d-wmp49" event={"ID":"a9f0a0d1-a120-45fd-95f6-6a5650096207","Type":"ContainerDied","Data":"a6e903d6deefbe2a66160c8dd4522cd6885148e114f0b955ac6911fe7b11bf32"} Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.295438 4748 scope.go:117] "RemoveContainer" containerID="905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.297685 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.298023 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq5cc\" (UniqueName: \"kubernetes.io/projected/a9f0a0d1-a120-45fd-95f6-6a5650096207-kube-api-access-cq5cc\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.298045 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.312786 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9f0a0d1-a120-45fd-95f6-6a5650096207" (UID: 
"a9f0a0d1-a120-45fd-95f6-6a5650096207"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.366655 4748 scope.go:117] "RemoveContainer" containerID="aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.381781 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a9f0a0d1-a120-45fd-95f6-6a5650096207" (UID: "a9f0a0d1-a120-45fd-95f6-6a5650096207"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.399320 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a9f0a0d1-a120-45fd-95f6-6a5650096207" (UID: "a9f0a0d1-a120-45fd-95f6-6a5650096207"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.400111 4748 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.400140 4748 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.400154 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f0a0d1-a120-45fd-95f6-6a5650096207-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.420591 4748 scope.go:117] "RemoveContainer" containerID="905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960" Mar 20 10:57:38 crc kubenswrapper[4748]: E0320 10:57:38.422682 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960\": container with ID starting with 905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960 not found: ID does not exist" containerID="905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.422725 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960"} err="failed to get container status \"905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960\": rpc error: code = NotFound desc = could not find container \"905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960\": container with ID 
starting with 905528e50dd4243b3974576ea6e5a824bfd8f949a041d1798afac52a1be8e960 not found: ID does not exist" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.422756 4748 scope.go:117] "RemoveContainer" containerID="aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a" Mar 20 10:57:38 crc kubenswrapper[4748]: E0320 10:57:38.423260 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a\": container with ID starting with aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a not found: ID does not exist" containerID="aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.423289 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a"} err="failed to get container status \"aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a\": rpc error: code = NotFound desc = could not find container \"aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a\": container with ID starting with aa3e5bcdeb53be9bcb7cd3059f52452ab74bfc640f4cce1815b0bb44915f913a not found: ID does not exist" Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.634751 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6495464d6d-wmp49"] Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.648542 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6495464d6d-wmp49"] Mar 20 10:57:38 crc kubenswrapper[4748]: I0320 10:57:38.680594 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 10:57:39 crc kubenswrapper[4748]: I0320 10:57:39.310401 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c","Type":"ContainerStarted","Data":"249ca1701c8f531bd627cffaa1c79d42c01f9909238ad46af7cb9457de232e3c"} Mar 20 10:57:39 crc kubenswrapper[4748]: I0320 10:57:39.532945 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05905a19-bbef-49a3-848c-d08b0862ba89" path="/var/lib/kubelet/pods/05905a19-bbef-49a3-848c-d08b0862ba89/volumes" Mar 20 10:57:39 crc kubenswrapper[4748]: I0320 10:57:39.533990 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" path="/var/lib/kubelet/pods/a9f0a0d1-a120-45fd-95f6-6a5650096207/volumes" Mar 20 10:57:40 crc kubenswrapper[4748]: I0320 10:57:40.338272 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c","Type":"ContainerStarted","Data":"b40c0d4b7b03a73a3f9258664807efa619dd6d10075b7a8be5ab99914452a5e3"} Mar 20 10:57:41 crc kubenswrapper[4748]: I0320 10:57:41.347617 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c","Type":"ContainerStarted","Data":"04b824cfc2e5554ae365a223c136b906996c0eeb521db09f196f6c9ac200bc9a"} Mar 20 10:57:41 crc kubenswrapper[4748]: I0320 10:57:41.478402 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b85b9d5c6-7bgxl" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 20 10:57:41 crc kubenswrapper[4748]: I0320 10:57:41.541756 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d4c76548d-zpdfr" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": dial tcp 
10.217.0.163:9311: connect: connection refused" Mar 20 10:57:41 crc kubenswrapper[4748]: I0320 10:57:41.542002 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d4c76548d-zpdfr" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": dial tcp 10.217.0.163:9311: connect: connection refused" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.238782 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.322848 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-logs\") pod \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.322937 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npgxc\" (UniqueName: \"kubernetes.io/projected/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-kube-api-access-npgxc\") pod \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.324286 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data\") pod \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.324324 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data-custom\") pod \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\" (UID: 
\"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.325058 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-logs" (OuterVolumeSpecName: "logs") pod "bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" (UID: "bb0cf1bb-fa5f-4f46-951a-d3bda77fd294"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.325477 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-combined-ca-bundle\") pod \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\" (UID: \"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294\") " Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.330149 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.347182 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" (UID: "bb0cf1bb-fa5f-4f46-951a-d3bda77fd294"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.376347 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-kube-api-access-npgxc" (OuterVolumeSpecName: "kube-api-access-npgxc") pod "bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" (UID: "bb0cf1bb-fa5f-4f46-951a-d3bda77fd294"). InnerVolumeSpecName "kube-api-access-npgxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.379481 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" (UID: "bb0cf1bb-fa5f-4f46-951a-d3bda77fd294"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.388577 4748 generic.go:334] "Generic (PLEG): container finished" podID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerID="0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982" exitCode=0 Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.388731 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d4c76548d-zpdfr" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.388924 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4c76548d-zpdfr" event={"ID":"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294","Type":"ContainerDied","Data":"0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982"} Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.389004 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4c76548d-zpdfr" event={"ID":"bb0cf1bb-fa5f-4f46-951a-d3bda77fd294","Type":"ContainerDied","Data":"ec1f065a1d6f350dbdf5ad5a083bebbe57b426c1100162ced22925c50149c171"} Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.389034 4748 scope.go:117] "RemoveContainer" containerID="0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.391903 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data" (OuterVolumeSpecName: 
"config-data") pod "bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" (UID: "bb0cf1bb-fa5f-4f46-951a-d3bda77fd294"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.420225 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.420196216 podStartE2EDuration="5.420196216s" podCreationTimestamp="2026-03-20 10:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:42.419293544 +0000 UTC m=+1297.560839348" watchObservedRunningTime="2026-03-20 10:57:42.420196216 +0000 UTC m=+1297.561742030" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.431940 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.432133 4748 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.432212 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.432285 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npgxc\" (UniqueName: \"kubernetes.io/projected/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294-kube-api-access-npgxc\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.452029 4748 scope.go:117] "RemoveContainer" 
containerID="b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.479495 4748 scope.go:117] "RemoveContainer" containerID="0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982" Mar 20 10:57:42 crc kubenswrapper[4748]: E0320 10:57:42.480173 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982\": container with ID starting with 0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982 not found: ID does not exist" containerID="0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.480221 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982"} err="failed to get container status \"0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982\": rpc error: code = NotFound desc = could not find container \"0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982\": container with ID starting with 0e08ca5d444382c0f1d8e48666ec37368c779dbed4152da6910ef5ef5a4ba982 not found: ID does not exist" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.480246 4748 scope.go:117] "RemoveContainer" containerID="b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef" Mar 20 10:57:42 crc kubenswrapper[4748]: E0320 10:57:42.480908 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef\": container with ID starting with b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef not found: ID does not exist" containerID="b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef" Mar 20 10:57:42 crc 
kubenswrapper[4748]: I0320 10:57:42.480975 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef"} err="failed to get container status \"b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef\": rpc error: code = NotFound desc = could not find container \"b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef\": container with ID starting with b945c162bbdb32d0b7305ed2bedd8589404cd774506d73bd6b1c2ed5c60b9fef not found: ID does not exist" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.732596 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d4c76548d-zpdfr"] Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.744779 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d4c76548d-zpdfr"] Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.918008 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-c66c949c-9cv26"] Mar 20 10:57:42 crc kubenswrapper[4748]: E0320 10:57:42.918554 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-log" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.918578 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-log" Mar 20 10:57:42 crc kubenswrapper[4748]: E0320 10:57:42.918609 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.918617 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api" Mar 20 10:57:42 crc kubenswrapper[4748]: E0320 10:57:42.918642 4748 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api-log" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.918649 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api-log" Mar 20 10:57:42 crc kubenswrapper[4748]: E0320 10:57:42.918667 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-api" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.918673 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-api" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.918902 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-api" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.918922 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.918942 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" containerName="barbican-api-log" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.918952 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f0a0d1-a120-45fd-95f6-6a5650096207" containerName="placement-log" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.928359 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.928434 4748 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.931885 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.943611 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.943877 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.944128 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 10:57:42 crc kubenswrapper[4748]: I0320 10:57:42.960654 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c66c949c-9cv26"] Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.035961 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.046674 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-public-tls-certs\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.046708 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-config-data\") pod 
\"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.046759 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12263f55-f4a7-481f-afab-45f51bd4d60d-run-httpd\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.046780 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz9wg\" (UniqueName: \"kubernetes.io/projected/12263f55-f4a7-481f-afab-45f51bd4d60d-kube-api-access-fz9wg\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.046874 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/12263f55-f4a7-481f-afab-45f51bd4d60d-etc-swift\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.046897 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12263f55-f4a7-481f-afab-45f51bd4d60d-log-httpd\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.046913 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-combined-ca-bundle\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.046954 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-internal-tls-certs\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.148457 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/12263f55-f4a7-481f-afab-45f51bd4d60d-etc-swift\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.148511 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12263f55-f4a7-481f-afab-45f51bd4d60d-log-httpd\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.148531 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-combined-ca-bundle\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.148575 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-internal-tls-certs\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.148604 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-public-tls-certs\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.148633 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-config-data\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.148674 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12263f55-f4a7-481f-afab-45f51bd4d60d-run-httpd\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.148693 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz9wg\" (UniqueName: \"kubernetes.io/projected/12263f55-f4a7-481f-afab-45f51bd4d60d-kube-api-access-fz9wg\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.150321 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12263f55-f4a7-481f-afab-45f51bd4d60d-run-httpd\") pod 
\"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.151268 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12263f55-f4a7-481f-afab-45f51bd4d60d-log-httpd\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.153287 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-config-data\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.154111 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-combined-ca-bundle\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.155505 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-public-tls-certs\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.161131 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/12263f55-f4a7-481f-afab-45f51bd4d60d-etc-swift\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 
10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.167289 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12263f55-f4a7-481f-afab-45f51bd4d60d-internal-tls-certs\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.169756 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz9wg\" (UniqueName: \"kubernetes.io/projected/12263f55-f4a7-481f-afab-45f51bd4d60d-kube-api-access-fz9wg\") pod \"swift-proxy-c66c949c-9cv26\" (UID: \"12263f55-f4a7-481f-afab-45f51bd4d60d\") " pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.277046 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.530574 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb0cf1bb-fa5f-4f46-951a-d3bda77fd294" path="/var/lib/kubelet/pods/bb0cf1bb-fa5f-4f46-951a-d3bda77fd294/volumes" Mar 20 10:57:43 crc kubenswrapper[4748]: I0320 10:57:43.714179 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 10:57:44 crc kubenswrapper[4748]: I0320 10:57:44.386546 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c66c949c-9cv26"] Mar 20 10:57:44 crc kubenswrapper[4748]: I0320 10:57:44.755177 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:44 crc kubenswrapper[4748]: I0320 10:57:44.755597 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="ceilometer-central-agent" 
containerID="cri-o://7e75f88044945d065adcd67d239ad490c07771d34ce5bef16fa3e8c0b57c522a" gracePeriod=30 Mar 20 10:57:44 crc kubenswrapper[4748]: I0320 10:57:44.757132 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="sg-core" containerID="cri-o://04847ca44cd324189231aad70f02b74561c73fd6831bcdaaf48927e9710ec62a" gracePeriod=30 Mar 20 10:57:44 crc kubenswrapper[4748]: I0320 10:57:44.757315 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="proxy-httpd" containerID="cri-o://cfb24b150bed54645f70e625690898fb4f02902434052c487e5f5e92fe2cbabf" gracePeriod=30 Mar 20 10:57:44 crc kubenswrapper[4748]: I0320 10:57:44.757310 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="ceilometer-notification-agent" containerID="cri-o://353e45ad1106596beef54eccff0883480d7ef6880bf17ea3857025f6eeabaef0" gracePeriod=30 Mar 20 10:57:44 crc kubenswrapper[4748]: I0320 10:57:44.783655 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.426902 4748 generic.go:334] "Generic (PLEG): container finished" podID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerID="cfb24b150bed54645f70e625690898fb4f02902434052c487e5f5e92fe2cbabf" exitCode=0 Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.426944 4748 generic.go:334] "Generic (PLEG): container finished" podID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerID="04847ca44cd324189231aad70f02b74561c73fd6831bcdaaf48927e9710ec62a" exitCode=2 Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 
10:57:45.426957 4748 generic.go:334] "Generic (PLEG): container finished" podID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerID="7e75f88044945d065adcd67d239ad490c07771d34ce5bef16fa3e8c0b57c522a" exitCode=0 Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.426974 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9ddc460-94b3-44d2-9eed-2b4e344f8232","Type":"ContainerDied","Data":"cfb24b150bed54645f70e625690898fb4f02902434052c487e5f5e92fe2cbabf"} Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.427018 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9ddc460-94b3-44d2-9eed-2b4e344f8232","Type":"ContainerDied","Data":"04847ca44cd324189231aad70f02b74561c73fd6831bcdaaf48927e9710ec62a"} Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.427030 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9ddc460-94b3-44d2-9eed-2b4e344f8232","Type":"ContainerDied","Data":"7e75f88044945d065adcd67d239ad490c07771d34ce5bef16fa3e8c0b57c522a"} Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.659399 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-c6v9d"] Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.661066 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-c6v9d" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.669817 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c6v9d"] Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.707512 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvf6\" (UniqueName: \"kubernetes.io/projected/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-kube-api-access-hpvf6\") pod \"nova-api-db-create-c6v9d\" (UID: \"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25\") " pod="openstack/nova-api-db-create-c6v9d" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.715062 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-operator-scripts\") pod \"nova-api-db-create-c6v9d\" (UID: \"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25\") " pod="openstack/nova-api-db-create-c6v9d" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.736406 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-pcsgd"] Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.738214 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pcsgd" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.745530 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pcsgd"] Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.747952 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.159:3000/\": dial tcp 10.217.0.159:3000: connect: connection refused" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.818497 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvf6\" (UniqueName: \"kubernetes.io/projected/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-kube-api-access-hpvf6\") pod \"nova-api-db-create-c6v9d\" (UID: \"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25\") " pod="openstack/nova-api-db-create-c6v9d" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.818722 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b48274-d738-49ad-81a5-a7c701193695-operator-scripts\") pod \"nova-cell0-db-create-pcsgd\" (UID: \"62b48274-d738-49ad-81a5-a7c701193695\") " pod="openstack/nova-cell0-db-create-pcsgd" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.818795 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzmrk\" (UniqueName: \"kubernetes.io/projected/62b48274-d738-49ad-81a5-a7c701193695-kube-api-access-vzmrk\") pod \"nova-cell0-db-create-pcsgd\" (UID: \"62b48274-d738-49ad-81a5-a7c701193695\") " pod="openstack/nova-cell0-db-create-pcsgd" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.818886 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-operator-scripts\") pod \"nova-api-db-create-c6v9d\" (UID: \"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25\") " pod="openstack/nova-api-db-create-c6v9d" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.819775 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-operator-scripts\") pod \"nova-api-db-create-c6v9d\" (UID: \"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25\") " pod="openstack/nova-api-db-create-c6v9d" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.858220 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6fd8-account-create-update-mcv9d"] Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.859882 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6fd8-account-create-update-mcv9d" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.862704 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.886704 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvf6\" (UniqueName: \"kubernetes.io/projected/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-kube-api-access-hpvf6\") pod \"nova-api-db-create-c6v9d\" (UID: \"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25\") " pod="openstack/nova-api-db-create-c6v9d" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.895998 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-dqp69"] Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.897472 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dqp69" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.921753 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctcrb\" (UniqueName: \"kubernetes.io/projected/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-kube-api-access-ctcrb\") pod \"nova-api-6fd8-account-create-update-mcv9d\" (UID: \"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b\") " pod="openstack/nova-api-6fd8-account-create-update-mcv9d" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.921950 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777d5c45-8727-451f-bb86-6048a03ceb0b-operator-scripts\") pod \"nova-cell1-db-create-dqp69\" (UID: \"777d5c45-8727-451f-bb86-6048a03ceb0b\") " pod="openstack/nova-cell1-db-create-dqp69" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.921999 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b48274-d738-49ad-81a5-a7c701193695-operator-scripts\") pod \"nova-cell0-db-create-pcsgd\" (UID: \"62b48274-d738-49ad-81a5-a7c701193695\") " pod="openstack/nova-cell0-db-create-pcsgd" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.922077 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnd2m\" (UniqueName: \"kubernetes.io/projected/777d5c45-8727-451f-bb86-6048a03ceb0b-kube-api-access-cnd2m\") pod \"nova-cell1-db-create-dqp69\" (UID: \"777d5c45-8727-451f-bb86-6048a03ceb0b\") " pod="openstack/nova-cell1-db-create-dqp69" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.922107 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzmrk\" (UniqueName: 
\"kubernetes.io/projected/62b48274-d738-49ad-81a5-a7c701193695-kube-api-access-vzmrk\") pod \"nova-cell0-db-create-pcsgd\" (UID: \"62b48274-d738-49ad-81a5-a7c701193695\") " pod="openstack/nova-cell0-db-create-pcsgd" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.922182 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-operator-scripts\") pod \"nova-api-6fd8-account-create-update-mcv9d\" (UID: \"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b\") " pod="openstack/nova-api-6fd8-account-create-update-mcv9d" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.923342 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b48274-d738-49ad-81a5-a7c701193695-operator-scripts\") pod \"nova-cell0-db-create-pcsgd\" (UID: \"62b48274-d738-49ad-81a5-a7c701193695\") " pod="openstack/nova-cell0-db-create-pcsgd" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.929342 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6fd8-account-create-update-mcv9d"] Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.950604 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzmrk\" (UniqueName: \"kubernetes.io/projected/62b48274-d738-49ad-81a5-a7c701193695-kube-api-access-vzmrk\") pod \"nova-cell0-db-create-pcsgd\" (UID: \"62b48274-d738-49ad-81a5-a7c701193695\") " pod="openstack/nova-cell0-db-create-pcsgd" Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.953549 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dqp69"] Mar 20 10:57:45 crc kubenswrapper[4748]: I0320 10:57:45.991868 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-c6v9d" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.026388 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-operator-scripts\") pod \"nova-api-6fd8-account-create-update-mcv9d\" (UID: \"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b\") " pod="openstack/nova-api-6fd8-account-create-update-mcv9d" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.026885 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctcrb\" (UniqueName: \"kubernetes.io/projected/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-kube-api-access-ctcrb\") pod \"nova-api-6fd8-account-create-update-mcv9d\" (UID: \"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b\") " pod="openstack/nova-api-6fd8-account-create-update-mcv9d" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.026596 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-operator-scripts\") pod \"nova-api-6fd8-account-create-update-mcv9d\" (UID: \"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b\") " pod="openstack/nova-api-6fd8-account-create-update-mcv9d" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.027524 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777d5c45-8727-451f-bb86-6048a03ceb0b-operator-scripts\") pod \"nova-cell1-db-create-dqp69\" (UID: \"777d5c45-8727-451f-bb86-6048a03ceb0b\") " pod="openstack/nova-cell1-db-create-dqp69" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.028589 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777d5c45-8727-451f-bb86-6048a03ceb0b-operator-scripts\") pod 
\"nova-cell1-db-create-dqp69\" (UID: \"777d5c45-8727-451f-bb86-6048a03ceb0b\") " pod="openstack/nova-cell1-db-create-dqp69" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.029016 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnd2m\" (UniqueName: \"kubernetes.io/projected/777d5c45-8727-451f-bb86-6048a03ceb0b-kube-api-access-cnd2m\") pod \"nova-cell1-db-create-dqp69\" (UID: \"777d5c45-8727-451f-bb86-6048a03ceb0b\") " pod="openstack/nova-cell1-db-create-dqp69" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.062963 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pcsgd" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.064212 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctcrb\" (UniqueName: \"kubernetes.io/projected/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-kube-api-access-ctcrb\") pod \"nova-api-6fd8-account-create-update-mcv9d\" (UID: \"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b\") " pod="openstack/nova-api-6fd8-account-create-update-mcv9d" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.069269 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-89ab-account-create-update-qtgz6"] Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.072060 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.074621 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnd2m\" (UniqueName: \"kubernetes.io/projected/777d5c45-8727-451f-bb86-6048a03ceb0b-kube-api-access-cnd2m\") pod \"nova-cell1-db-create-dqp69\" (UID: \"777d5c45-8727-451f-bb86-6048a03ceb0b\") " pod="openstack/nova-cell1-db-create-dqp69" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.078376 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-89ab-account-create-update-qtgz6"] Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.080904 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.131465 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npz9g\" (UniqueName: \"kubernetes.io/projected/dde56056-a261-4e3a-8cd6-b703d33a14ca-kube-api-access-npz9g\") pod \"nova-cell0-89ab-account-create-update-qtgz6\" (UID: \"dde56056-a261-4e3a-8cd6-b703d33a14ca\") " pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.131586 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dde56056-a261-4e3a-8cd6-b703d33a14ca-operator-scripts\") pod \"nova-cell0-89ab-account-create-update-qtgz6\" (UID: \"dde56056-a261-4e3a-8cd6-b703d33a14ca\") " pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.189004 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6fd8-account-create-update-mcv9d" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.233577 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npz9g\" (UniqueName: \"kubernetes.io/projected/dde56056-a261-4e3a-8cd6-b703d33a14ca-kube-api-access-npz9g\") pod \"nova-cell0-89ab-account-create-update-qtgz6\" (UID: \"dde56056-a261-4e3a-8cd6-b703d33a14ca\") " pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.233664 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dde56056-a261-4e3a-8cd6-b703d33a14ca-operator-scripts\") pod \"nova-cell0-89ab-account-create-update-qtgz6\" (UID: \"dde56056-a261-4e3a-8cd6-b703d33a14ca\") " pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.234559 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dde56056-a261-4e3a-8cd6-b703d33a14ca-operator-scripts\") pod \"nova-cell0-89ab-account-create-update-qtgz6\" (UID: \"dde56056-a261-4e3a-8cd6-b703d33a14ca\") " pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.266012 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npz9g\" (UniqueName: \"kubernetes.io/projected/dde56056-a261-4e3a-8cd6-b703d33a14ca-kube-api-access-npz9g\") pod \"nova-cell0-89ab-account-create-update-qtgz6\" (UID: \"dde56056-a261-4e3a-8cd6-b703d33a14ca\") " pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.269134 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d299-account-create-update-xxxtz"] Mar 20 10:57:46 crc kubenswrapper[4748]: 
I0320 10:57:46.270374 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d299-account-create-update-xxxtz" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.273111 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.280278 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d299-account-create-update-xxxtz"] Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.305127 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dqp69" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.335517 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2dafa6c-d784-48eb-926d-0648fb990dd4-operator-scripts\") pod \"nova-cell1-d299-account-create-update-xxxtz\" (UID: \"c2dafa6c-d784-48eb-926d-0648fb990dd4\") " pod="openstack/nova-cell1-d299-account-create-update-xxxtz" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.335636 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt9dx\" (UniqueName: \"kubernetes.io/projected/c2dafa6c-d784-48eb-926d-0648fb990dd4-kube-api-access-zt9dx\") pod \"nova-cell1-d299-account-create-update-xxxtz\" (UID: \"c2dafa6c-d784-48eb-926d-0648fb990dd4\") " pod="openstack/nova-cell1-d299-account-create-update-xxxtz" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.437619 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2dafa6c-d784-48eb-926d-0648fb990dd4-operator-scripts\") pod \"nova-cell1-d299-account-create-update-xxxtz\" (UID: \"c2dafa6c-d784-48eb-926d-0648fb990dd4\") " 
pod="openstack/nova-cell1-d299-account-create-update-xxxtz" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.437787 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt9dx\" (UniqueName: \"kubernetes.io/projected/c2dafa6c-d784-48eb-926d-0648fb990dd4-kube-api-access-zt9dx\") pod \"nova-cell1-d299-account-create-update-xxxtz\" (UID: \"c2dafa6c-d784-48eb-926d-0648fb990dd4\") " pod="openstack/nova-cell1-d299-account-create-update-xxxtz" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.438615 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2dafa6c-d784-48eb-926d-0648fb990dd4-operator-scripts\") pod \"nova-cell1-d299-account-create-update-xxxtz\" (UID: \"c2dafa6c-d784-48eb-926d-0648fb990dd4\") " pod="openstack/nova-cell1-d299-account-create-update-xxxtz" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.452299 4748 generic.go:334] "Generic (PLEG): container finished" podID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerID="353e45ad1106596beef54eccff0883480d7ef6880bf17ea3857025f6eeabaef0" exitCode=0 Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.452363 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9ddc460-94b3-44d2-9eed-2b4e344f8232","Type":"ContainerDied","Data":"353e45ad1106596beef54eccff0883480d7ef6880bf17ea3857025f6eeabaef0"} Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.457160 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt9dx\" (UniqueName: \"kubernetes.io/projected/c2dafa6c-d784-48eb-926d-0648fb990dd4-kube-api-access-zt9dx\") pod \"nova-cell1-d299-account-create-update-xxxtz\" (UID: \"c2dafa6c-d784-48eb-926d-0648fb990dd4\") " pod="openstack/nova-cell1-d299-account-create-update-xxxtz" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.481300 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" Mar 20 10:57:46 crc kubenswrapper[4748]: I0320 10:57:46.638494 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d299-account-create-update-xxxtz" Mar 20 10:57:48 crc kubenswrapper[4748]: I0320 10:57:48.246353 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 10:57:49 crc kubenswrapper[4748]: I0320 10:57:49.116178 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:49 crc kubenswrapper[4748]: I0320 10:57:49.131267 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:57:51 crc kubenswrapper[4748]: I0320 10:57:51.477706 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b85b9d5c6-7bgxl" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 20 10:57:51 crc kubenswrapper[4748]: W0320 10:57:51.477937 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12263f55_f4a7_481f_afab_45f51bd4d60d.slice/crio-42d55aa511b38c1da9a2625d1b7bf8f621f496ea145d7f3083f37e08142b6176 WatchSource:0}: Error finding container 42d55aa511b38c1da9a2625d1b7bf8f621f496ea145d7f3083f37e08142b6176: Status 404 returned error can't find the container with id 42d55aa511b38c1da9a2625d1b7bf8f621f496ea145d7f3083f37e08142b6176 Mar 20 10:57:51 crc kubenswrapper[4748]: I0320 10:57:51.478151 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:57:51 crc kubenswrapper[4748]: I0320 10:57:51.529359 4748 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c66c949c-9cv26" event={"ID":"12263f55-f4a7-481f-afab-45f51bd4d60d","Type":"ContainerStarted","Data":"42d55aa511b38c1da9a2625d1b7bf8f621f496ea145d7f3083f37e08142b6176"} Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.160262 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6fd8-account-create-update-mcv9d"] Mar 20 10:57:52 crc kubenswrapper[4748]: W0320 10:57:52.163612 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode90432d5_5ca1_4447_9e0b_2afaafa0ba1b.slice/crio-1b5357bbb2e316a09f8b3b32c892023682302cbcd1ac303624a4c0cc60c7727e WatchSource:0}: Error finding container 1b5357bbb2e316a09f8b3b32c892023682302cbcd1ac303624a4c0cc60c7727e: Status 404 returned error can't find the container with id 1b5357bbb2e316a09f8b3b32c892023682302cbcd1ac303624a4c0cc60c7727e Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.312367 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.407537 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-combined-ca-bundle\") pod \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.407661 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-config-data\") pod \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.407704 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m7rx\" (UniqueName: \"kubernetes.io/projected/f9ddc460-94b3-44d2-9eed-2b4e344f8232-kube-api-access-8m7rx\") pod \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.407799 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-log-httpd\") pod \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.407887 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-sg-core-conf-yaml\") pod \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.407992 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-scripts\") pod \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.408024 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-run-httpd\") pod \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\" (UID: \"f9ddc460-94b3-44d2-9eed-2b4e344f8232\") " Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.408925 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f9ddc460-94b3-44d2-9eed-2b4e344f8232" (UID: "f9ddc460-94b3-44d2-9eed-2b4e344f8232"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.510153 4748 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.526262 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6fd8-account-create-update-mcv9d" event={"ID":"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b","Type":"ContainerStarted","Data":"1b5357bbb2e316a09f8b3b32c892023682302cbcd1ac303624a4c0cc60c7727e"} Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.531341 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9ddc460-94b3-44d2-9eed-2b4e344f8232","Type":"ContainerDied","Data":"1dbd4fa19e69235d080432aa0b3ac54cce1f41f90a1059f97b72648d84c0fd39"} Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.531403 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.531415 4748 scope.go:117] "RemoveContainer" containerID="cfb24b150bed54645f70e625690898fb4f02902434052c487e5f5e92fe2cbabf" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.551233 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pcsgd"] Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.555102 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f9ddc460-94b3-44d2-9eed-2b4e344f8232" (UID: "f9ddc460-94b3-44d2-9eed-2b4e344f8232"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.559377 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-scripts" (OuterVolumeSpecName: "scripts") pod "f9ddc460-94b3-44d2-9eed-2b4e344f8232" (UID: "f9ddc460-94b3-44d2-9eed-2b4e344f8232"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.563694 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-config-data" (OuterVolumeSpecName: "config-data") pod "f9ddc460-94b3-44d2-9eed-2b4e344f8232" (UID: "f9ddc460-94b3-44d2-9eed-2b4e344f8232"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.569646 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c6v9d"] Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.581159 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dqp69"] Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.584407 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9ddc460-94b3-44d2-9eed-2b4e344f8232" (UID: "f9ddc460-94b3-44d2-9eed-2b4e344f8232"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:52 crc kubenswrapper[4748]: W0320 10:57:52.587107 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddde56056_a261_4e3a_8cd6_b703d33a14ca.slice/crio-e5368974d12afc70b122a7116a37179b45c32c541e8a90ba1c8baaca1285b9f0 WatchSource:0}: Error finding container e5368974d12afc70b122a7116a37179b45c32c541e8a90ba1c8baaca1285b9f0: Status 404 returned error can't find the container with id e5368974d12afc70b122a7116a37179b45c32c541e8a90ba1c8baaca1285b9f0 Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.587152 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d299-account-create-update-xxxtz"] Mar 20 10:57:52 crc kubenswrapper[4748]: E0320 10:57:52.587457 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Mar 20 10:57:52 crc kubenswrapper[4748]: E0320 10:57:52.587581 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n665h5dfh68dh8h5bfh57fh5cfh78h78hfbh67ch67ch97h59ch5fh655h64bh655h7dh7dhf6h68bh5b6h59h684h7bhcdh5ffhf5h59fh585h5c5q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v46t9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(2b2f2b26-6292-47bb-b8ee-971d9b47c85d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 10:57:52 crc kubenswrapper[4748]: E0320 10:57:52.588859 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="2b2f2b26-6292-47bb-b8ee-971d9b47c85d" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.594746 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-89ab-account-create-update-qtgz6"] Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.599801 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f9ddc460-94b3-44d2-9eed-2b4e344f8232" (UID: "f9ddc460-94b3-44d2-9eed-2b4e344f8232"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:52 crc kubenswrapper[4748]: W0320 10:57:52.601680 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2dafa6c_d784_48eb_926d_0648fb990dd4.slice/crio-120c3364638503e625ce17aa539ef6bda4a553920782f085616556e0486d2280 WatchSource:0}: Error finding container 120c3364638503e625ce17aa539ef6bda4a553920782f085616556e0486d2280: Status 404 returned error can't find the container with id 120c3364638503e625ce17aa539ef6bda4a553920782f085616556e0486d2280 Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.610437 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ddc460-94b3-44d2-9eed-2b4e344f8232-kube-api-access-8m7rx" (OuterVolumeSpecName: "kube-api-access-8m7rx") pod "f9ddc460-94b3-44d2-9eed-2b4e344f8232" (UID: "f9ddc460-94b3-44d2-9eed-2b4e344f8232"). InnerVolumeSpecName "kube-api-access-8m7rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.611338 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m7rx\" (UniqueName: \"kubernetes.io/projected/f9ddc460-94b3-44d2-9eed-2b4e344f8232-kube-api-access-8m7rx\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.611368 4748 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.611380 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.611391 4748 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/f9ddc460-94b3-44d2-9eed-2b4e344f8232-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.611404 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.611415 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddc460-94b3-44d2-9eed-2b4e344f8232-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.865851 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.874191 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.887788 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:52 crc kubenswrapper[4748]: E0320 10:57:52.888289 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="ceilometer-central-agent" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.888316 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="ceilometer-central-agent" Mar 20 10:57:52 crc kubenswrapper[4748]: E0320 10:57:52.888330 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="sg-core" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.888340 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="sg-core" Mar 20 10:57:52 crc kubenswrapper[4748]: E0320 10:57:52.888378 4748 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="proxy-httpd" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.888386 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="proxy-httpd" Mar 20 10:57:52 crc kubenswrapper[4748]: E0320 10:57:52.888402 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="ceilometer-notification-agent" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.888410 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="ceilometer-notification-agent" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.888613 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="sg-core" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.888632 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="proxy-httpd" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.888643 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="ceilometer-central-agent" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.888660 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" containerName="ceilometer-notification-agent" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.890350 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.892613 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.892848 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.911483 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.915985 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.916203 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.916421 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-run-httpd\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.916545 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-config-data\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " 
pod="openstack/ceilometer-0" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.916591 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-scripts\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.916676 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-log-httpd\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:52 crc kubenswrapper[4748]: I0320 10:57:52.916737 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj4rj\" (UniqueName: \"kubernetes.io/projected/847669ae-106d-4f9f-8f74-ea1f8e2d221d-kube-api-access-xj4rj\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.018453 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-config-data\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.018499 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-scripts\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.018518 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-log-httpd\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.018543 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj4rj\" (UniqueName: \"kubernetes.io/projected/847669ae-106d-4f9f-8f74-ea1f8e2d221d-kube-api-access-xj4rj\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.018616 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.018648 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.018691 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-run-httpd\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.019507 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-run-httpd\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc 
kubenswrapper[4748]: I0320 10:57:53.020119 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-log-httpd\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.025080 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.025659 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.026673 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-config-data\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.030959 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-scripts\") pod \"ceilometer-0\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.043262 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj4rj\" (UniqueName: \"kubernetes.io/projected/847669ae-106d-4f9f-8f74-ea1f8e2d221d-kube-api-access-xj4rj\") pod \"ceilometer-0\" (UID: 
\"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.113053 4748 scope.go:117] "RemoveContainer" containerID="04847ca44cd324189231aad70f02b74561c73fd6831bcdaaf48927e9710ec62a" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.207234 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.310739 4748 scope.go:117] "RemoveContainer" containerID="353e45ad1106596beef54eccff0883480d7ef6880bf17ea3857025f6eeabaef0" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.537602 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ddc460-94b3-44d2-9eed-2b4e344f8232" path="/var/lib/kubelet/pods/f9ddc460-94b3-44d2-9eed-2b4e344f8232/volumes" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.542502 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d299-account-create-update-xxxtz" event={"ID":"c2dafa6c-d784-48eb-926d-0648fb990dd4","Type":"ContainerStarted","Data":"120c3364638503e625ce17aa539ef6bda4a553920782f085616556e0486d2280"} Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.543543 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" event={"ID":"dde56056-a261-4e3a-8cd6-b703d33a14ca","Type":"ContainerStarted","Data":"e5368974d12afc70b122a7116a37179b45c32c541e8a90ba1c8baaca1285b9f0"} Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.544609 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dqp69" event={"ID":"777d5c45-8727-451f-bb86-6048a03ceb0b","Type":"ContainerStarted","Data":"1fe4960b90a4b4132db8466ffc49ad7253f667cc5746faa45ae3a5c8db74c0a5"} Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.548192 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c66c949c-9cv26" 
event={"ID":"12263f55-f4a7-481f-afab-45f51bd4d60d","Type":"ContainerStarted","Data":"7bec66c8a65c0634ff5be0c9651b38ad416e0bb8f45cc186a0927810af3be0be"} Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.548985 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pcsgd" event={"ID":"62b48274-d738-49ad-81a5-a7c701193695","Type":"ContainerStarted","Data":"cb2fce004344ed11a206db1eb1c9e44696ea9ec87eaf38792f0b30af7ec64707"} Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.549819 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c6v9d" event={"ID":"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25","Type":"ContainerStarted","Data":"8b19bed78de1f97825f8c8e0c00e43bfceceac68ffea11545adf27877cc8019b"} Mar 20 10:57:53 crc kubenswrapper[4748]: E0320 10:57:53.607109 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="2b2f2b26-6292-47bb-b8ee-971d9b47c85d" Mar 20 10:57:53 crc kubenswrapper[4748]: I0320 10:57:53.627399 4748 scope.go:117] "RemoveContainer" containerID="7e75f88044945d065adcd67d239ad490c07771d34ce5bef16fa3e8c0b57c522a" Mar 20 10:57:54 crc kubenswrapper[4748]: I0320 10:57:54.054284 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:57:54 crc kubenswrapper[4748]: I0320 10:57:54.562301 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6fd8-account-create-update-mcv9d" event={"ID":"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b","Type":"ContainerStarted","Data":"0e20c16be7121fffee41879fde884299ec5b691be8fd53dffb29b0715fcc4d7b"} Mar 20 10:57:54 crc kubenswrapper[4748]: I0320 10:57:54.565127 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"847669ae-106d-4f9f-8f74-ea1f8e2d221d","Type":"ContainerStarted","Data":"100c881e9aa7efcaa37f6078271596fdccae1d2e5f3fe70104b864275a1364e5"} Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.576137 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d299-account-create-update-xxxtz" event={"ID":"c2dafa6c-d784-48eb-926d-0648fb990dd4","Type":"ContainerStarted","Data":"215e76e819f49de525ff0060641fbe0eff4c291c6d92a2b41f7a99c17b5747e3"} Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.578057 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" event={"ID":"dde56056-a261-4e3a-8cd6-b703d33a14ca","Type":"ContainerStarted","Data":"c1883afcac4f293a3d3b5ec3f2d4c57afa2b80b2c38b93e3a42b5c4c78645574"} Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.580436 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dqp69" event={"ID":"777d5c45-8727-451f-bb86-6048a03ceb0b","Type":"ContainerStarted","Data":"9bd4f5b57b785ddc899d661a8c8d5dd749ac08edccafe7bb665a8d95b9604f2a"} Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.583189 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c66c949c-9cv26" event={"ID":"12263f55-f4a7-481f-afab-45f51bd4d60d","Type":"ContainerStarted","Data":"732dc388f5915764381fe51810024c5e6c230c46d6559f08420f8320dc015610"} Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.583321 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.583362 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.585882 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pcsgd" 
event={"ID":"62b48274-d738-49ad-81a5-a7c701193695","Type":"ContainerStarted","Data":"a5cce16e1895a78c7c2ba4c5b940d7ec88cb5d9dfe1c69b5555e34ea041a135b"} Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.601949 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c6v9d" event={"ID":"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25","Type":"ContainerStarted","Data":"e7df77cd00a4e6f0259903becec720d11b8639fd32460f92aa673a4d87b59994"} Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.684550 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-dqp69" podStartSLOduration=10.684523975 podStartE2EDuration="10.684523975s" podCreationTimestamp="2026-03-20 10:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:55.676002302 +0000 UTC m=+1310.817548116" watchObservedRunningTime="2026-03-20 10:57:55.684523975 +0000 UTC m=+1310.826069789" Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.695885 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" podStartSLOduration=9.69585494 podStartE2EDuration="9.69585494s" podCreationTimestamp="2026-03-20 10:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:55.695827889 +0000 UTC m=+1310.837373703" watchObservedRunningTime="2026-03-20 10:57:55.69585494 +0000 UTC m=+1310.837400754" Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.711958 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-c6v9d" podStartSLOduration=10.711927143 podStartE2EDuration="10.711927143s" podCreationTimestamp="2026-03-20 10:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:55.708496007 +0000 UTC m=+1310.850041831" watchObservedRunningTime="2026-03-20 10:57:55.711927143 +0000 UTC m=+1310.853472957" Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.729328 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-pcsgd" podStartSLOduration=10.729305989 podStartE2EDuration="10.729305989s" podCreationTimestamp="2026-03-20 10:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:55.723965905 +0000 UTC m=+1310.865511719" watchObservedRunningTime="2026-03-20 10:57:55.729305989 +0000 UTC m=+1310.870851813" Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.742346 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-6fd8-account-create-update-mcv9d" podStartSLOduration=10.742329006 podStartE2EDuration="10.742329006s" podCreationTimestamp="2026-03-20 10:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:55.737821333 +0000 UTC m=+1310.879367147" watchObservedRunningTime="2026-03-20 10:57:55.742329006 +0000 UTC m=+1310.883874820" Mar 20 10:57:55 crc kubenswrapper[4748]: I0320 10:57:55.763030 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-d299-account-create-update-xxxtz" podStartSLOduration=9.763000535 podStartE2EDuration="9.763000535s" podCreationTimestamp="2026-03-20 10:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:55.750652845 +0000 UTC m=+1310.892198659" watchObservedRunningTime="2026-03-20 10:57:55.763000535 +0000 UTC m=+1310.904546359" Mar 20 10:57:56 crc kubenswrapper[4748]: E0320 10:57:56.594165 4748 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62b48274_d738_49ad_81a5_a7c701193695.slice/crio-a5cce16e1895a78c7c2ba4c5b940d7ec88cb5d9dfe1c69b5555e34ea041a135b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod777d5c45_8727_451f_bb86_6048a03ceb0b.slice/crio-9bd4f5b57b785ddc899d661a8c8d5dd749ac08edccafe7bb665a8d95b9604f2a.scope\": RecentStats: unable to find data in memory cache]" Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.146691 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-695f6cc9c-5bkz4" Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.172087 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-c66c949c-9cv26" podStartSLOduration=15.172065696 podStartE2EDuration="15.172065696s" podCreationTimestamp="2026-03-20 10:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:55.791530021 +0000 UTC m=+1310.933075835" watchObservedRunningTime="2026-03-20 10:57:57.172065696 +0000 UTC m=+1312.313611510" Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.223128 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-589cf645cb-wkg45"] Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.229060 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-589cf645cb-wkg45" podUID="c8728f34-b5b9-4ced-9424-b83ff940580f" containerName="neutron-api" containerID="cri-o://a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa" gracePeriod=30 Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.229197 4748 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-589cf645cb-wkg45" podUID="c8728f34-b5b9-4ced-9424-b83ff940580f" containerName="neutron-httpd" containerID="cri-o://eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45" gracePeriod=30 Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.619890 4748 generic.go:334] "Generic (PLEG): container finished" podID="e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25" containerID="e7df77cd00a4e6f0259903becec720d11b8639fd32460f92aa673a4d87b59994" exitCode=0 Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.619998 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c6v9d" event={"ID":"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25","Type":"ContainerDied","Data":"e7df77cd00a4e6f0259903becec720d11b8639fd32460f92aa673a4d87b59994"} Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.622915 4748 generic.go:334] "Generic (PLEG): container finished" podID="c2dafa6c-d784-48eb-926d-0648fb990dd4" containerID="215e76e819f49de525ff0060641fbe0eff4c291c6d92a2b41f7a99c17b5747e3" exitCode=0 Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.623000 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d299-account-create-update-xxxtz" event={"ID":"c2dafa6c-d784-48eb-926d-0648fb990dd4","Type":"ContainerDied","Data":"215e76e819f49de525ff0060641fbe0eff4c291c6d92a2b41f7a99c17b5747e3"} Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.633051 4748 generic.go:334] "Generic (PLEG): container finished" podID="dde56056-a261-4e3a-8cd6-b703d33a14ca" containerID="c1883afcac4f293a3d3b5ec3f2d4c57afa2b80b2c38b93e3a42b5c4c78645574" exitCode=0 Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.633120 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" event={"ID":"dde56056-a261-4e3a-8cd6-b703d33a14ca","Type":"ContainerDied","Data":"c1883afcac4f293a3d3b5ec3f2d4c57afa2b80b2c38b93e3a42b5c4c78645574"} Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 
10:57:57.637857 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"847669ae-106d-4f9f-8f74-ea1f8e2d221d","Type":"ContainerStarted","Data":"d4cbd022be390d67fc0a3423cfe7cffd3eabc70daf6a0fd850ed78e3f32e83e5"} Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.641174 4748 generic.go:334] "Generic (PLEG): container finished" podID="777d5c45-8727-451f-bb86-6048a03ceb0b" containerID="9bd4f5b57b785ddc899d661a8c8d5dd749ac08edccafe7bb665a8d95b9604f2a" exitCode=0 Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.641224 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dqp69" event={"ID":"777d5c45-8727-451f-bb86-6048a03ceb0b","Type":"ContainerDied","Data":"9bd4f5b57b785ddc899d661a8c8d5dd749ac08edccafe7bb665a8d95b9604f2a"} Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.643862 4748 generic.go:334] "Generic (PLEG): container finished" podID="c8728f34-b5b9-4ced-9424-b83ff940580f" containerID="eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45" exitCode=0 Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.643930 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589cf645cb-wkg45" event={"ID":"c8728f34-b5b9-4ced-9424-b83ff940580f","Type":"ContainerDied","Data":"eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45"} Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.648986 4748 generic.go:334] "Generic (PLEG): container finished" podID="e90432d5-5ca1-4447-9e0b-2afaafa0ba1b" containerID="0e20c16be7121fffee41879fde884299ec5b691be8fd53dffb29b0715fcc4d7b" exitCode=0 Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.649051 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6fd8-account-create-update-mcv9d" event={"ID":"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b","Type":"ContainerDied","Data":"0e20c16be7121fffee41879fde884299ec5b691be8fd53dffb29b0715fcc4d7b"} Mar 20 10:57:57 crc 
kubenswrapper[4748]: I0320 10:57:57.654248 4748 generic.go:334] "Generic (PLEG): container finished" podID="62b48274-d738-49ad-81a5-a7c701193695" containerID="a5cce16e1895a78c7c2ba4c5b940d7ec88cb5d9dfe1c69b5555e34ea041a135b" exitCode=0 Mar 20 10:57:57 crc kubenswrapper[4748]: I0320 10:57:57.654311 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pcsgd" event={"ID":"62b48274-d738-49ad-81a5-a7c701193695","Type":"ContainerDied","Data":"a5cce16e1895a78c7c2ba4c5b940d7ec88cb5d9dfe1c69b5555e34ea041a135b"} Mar 20 10:57:58 crc kubenswrapper[4748]: I0320 10:57:58.296669 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:57:58 crc kubenswrapper[4748]: I0320 10:57:58.664702 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"847669ae-106d-4f9f-8f74-ea1f8e2d221d","Type":"ContainerStarted","Data":"b548843003db3272aba4481961ab94ee37fe5d87780e1d7d7b1100cbb1db6c7e"} Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.146754 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pcsgd" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.239742 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b48274-d738-49ad-81a5-a7c701193695-operator-scripts\") pod \"62b48274-d738-49ad-81a5-a7c701193695\" (UID: \"62b48274-d738-49ad-81a5-a7c701193695\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.239795 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzmrk\" (UniqueName: \"kubernetes.io/projected/62b48274-d738-49ad-81a5-a7c701193695-kube-api-access-vzmrk\") pod \"62b48274-d738-49ad-81a5-a7c701193695\" (UID: \"62b48274-d738-49ad-81a5-a7c701193695\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.241009 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b48274-d738-49ad-81a5-a7c701193695-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62b48274-d738-49ad-81a5-a7c701193695" (UID: "62b48274-d738-49ad-81a5-a7c701193695"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.246113 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b48274-d738-49ad-81a5-a7c701193695-kube-api-access-vzmrk" (OuterVolumeSpecName: "kube-api-access-vzmrk") pod "62b48274-d738-49ad-81a5-a7c701193695" (UID: "62b48274-d738-49ad-81a5-a7c701193695"). InnerVolumeSpecName "kube-api-access-vzmrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.341898 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b48274-d738-49ad-81a5-a7c701193695-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.342193 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzmrk\" (UniqueName: \"kubernetes.io/projected/62b48274-d738-49ad-81a5-a7c701193695-kube-api-access-vzmrk\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.432240 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.447726 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6fd8-account-create-update-mcv9d" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.464633 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d299-account-create-update-xxxtz" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.468246 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dqp69" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.549928 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnd2m\" (UniqueName: \"kubernetes.io/projected/777d5c45-8727-451f-bb86-6048a03ceb0b-kube-api-access-cnd2m\") pod \"777d5c45-8727-451f-bb86-6048a03ceb0b\" (UID: \"777d5c45-8727-451f-bb86-6048a03ceb0b\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.550107 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777d5c45-8727-451f-bb86-6048a03ceb0b-operator-scripts\") pod \"777d5c45-8727-451f-bb86-6048a03ceb0b\" (UID: \"777d5c45-8727-451f-bb86-6048a03ceb0b\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.550173 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctcrb\" (UniqueName: \"kubernetes.io/projected/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-kube-api-access-ctcrb\") pod \"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b\" (UID: \"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.550195 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npz9g\" (UniqueName: \"kubernetes.io/projected/dde56056-a261-4e3a-8cd6-b703d33a14ca-kube-api-access-npz9g\") pod \"dde56056-a261-4e3a-8cd6-b703d33a14ca\" (UID: \"dde56056-a261-4e3a-8cd6-b703d33a14ca\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.550243 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2dafa6c-d784-48eb-926d-0648fb990dd4-operator-scripts\") pod \"c2dafa6c-d784-48eb-926d-0648fb990dd4\" (UID: \"c2dafa6c-d784-48eb-926d-0648fb990dd4\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.550282 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zt9dx\" (UniqueName: \"kubernetes.io/projected/c2dafa6c-d784-48eb-926d-0648fb990dd4-kube-api-access-zt9dx\") pod \"c2dafa6c-d784-48eb-926d-0648fb990dd4\" (UID: \"c2dafa6c-d784-48eb-926d-0648fb990dd4\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.550302 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dde56056-a261-4e3a-8cd6-b703d33a14ca-operator-scripts\") pod \"dde56056-a261-4e3a-8cd6-b703d33a14ca\" (UID: \"dde56056-a261-4e3a-8cd6-b703d33a14ca\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.550384 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-operator-scripts\") pod \"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b\" (UID: \"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.552435 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e90432d5-5ca1-4447-9e0b-2afaafa0ba1b" (UID: "e90432d5-5ca1-4447-9e0b-2afaafa0ba1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.553597 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/777d5c45-8727-451f-bb86-6048a03ceb0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "777d5c45-8727-451f-bb86-6048a03ceb0b" (UID: "777d5c45-8727-451f-bb86-6048a03ceb0b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.560525 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde56056-a261-4e3a-8cd6-b703d33a14ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dde56056-a261-4e3a-8cd6-b703d33a14ca" (UID: "dde56056-a261-4e3a-8cd6-b703d33a14ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.561030 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2dafa6c-d784-48eb-926d-0648fb990dd4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2dafa6c-d784-48eb-926d-0648fb990dd4" (UID: "c2dafa6c-d784-48eb-926d-0648fb990dd4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.563861 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-kube-api-access-ctcrb" (OuterVolumeSpecName: "kube-api-access-ctcrb") pod "e90432d5-5ca1-4447-9e0b-2afaafa0ba1b" (UID: "e90432d5-5ca1-4447-9e0b-2afaafa0ba1b"). InnerVolumeSpecName "kube-api-access-ctcrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.566048 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777d5c45-8727-451f-bb86-6048a03ceb0b-kube-api-access-cnd2m" (OuterVolumeSpecName: "kube-api-access-cnd2m") pod "777d5c45-8727-451f-bb86-6048a03ceb0b" (UID: "777d5c45-8727-451f-bb86-6048a03ceb0b"). InnerVolumeSpecName "kube-api-access-cnd2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.568473 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2dafa6c-d784-48eb-926d-0648fb990dd4-kube-api-access-zt9dx" (OuterVolumeSpecName: "kube-api-access-zt9dx") pod "c2dafa6c-d784-48eb-926d-0648fb990dd4" (UID: "c2dafa6c-d784-48eb-926d-0648fb990dd4"). InnerVolumeSpecName "kube-api-access-zt9dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.572657 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde56056-a261-4e3a-8cd6-b703d33a14ca-kube-api-access-npz9g" (OuterVolumeSpecName: "kube-api-access-npz9g") pod "dde56056-a261-4e3a-8cd6-b703d33a14ca" (UID: "dde56056-a261-4e3a-8cd6-b703d33a14ca"). InnerVolumeSpecName "kube-api-access-npz9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.646748 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c6v9d" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.653480 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.653565 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnd2m\" (UniqueName: \"kubernetes.io/projected/777d5c45-8727-451f-bb86-6048a03ceb0b-kube-api-access-cnd2m\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.653593 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777d5c45-8727-451f-bb86-6048a03ceb0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.653606 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctcrb\" (UniqueName: \"kubernetes.io/projected/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-kube-api-access-ctcrb\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.653619 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npz9g\" (UniqueName: \"kubernetes.io/projected/dde56056-a261-4e3a-8cd6-b703d33a14ca-kube-api-access-npz9g\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.653632 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2dafa6c-d784-48eb-926d-0648fb990dd4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.653645 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt9dx\" (UniqueName: \"kubernetes.io/projected/c2dafa6c-d784-48eb-926d-0648fb990dd4-kube-api-access-zt9dx\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.653658 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dde56056-a261-4e3a-8cd6-b703d33a14ca-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.653670 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.713766 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.713786 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-89ab-account-create-update-qtgz6" event={"ID":"dde56056-a261-4e3a-8cd6-b703d33a14ca","Type":"ContainerDied","Data":"e5368974d12afc70b122a7116a37179b45c32c541e8a90ba1c8baaca1285b9f0"} Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.715152 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5368974d12afc70b122a7116a37179b45c32c541e8a90ba1c8baaca1285b9f0" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.735433 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"847669ae-106d-4f9f-8f74-ea1f8e2d221d","Type":"ContainerStarted","Data":"5378fa20cdc8defe0e8c30928c7e93473341304ae8d67e7d815e059dbbb01837"} Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.742468 4748 generic.go:334] "Generic (PLEG): container finished" podID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerID="87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7" exitCode=137 Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.742550 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b85b9d5c6-7bgxl" event={"ID":"18e8975c-a2d8-4319-b175-2a66ce3d97c9","Type":"ContainerDied","Data":"87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7"} Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.742581 4748 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-b85b9d5c6-7bgxl" event={"ID":"18e8975c-a2d8-4319-b175-2a66ce3d97c9","Type":"ContainerDied","Data":"69c482b45ad1acfdb209c46107ce31e29092379c1bd726b0b2a9235cbf1d7c97"} Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.742599 4748 scope.go:117] "RemoveContainer" containerID="1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.742720 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b85b9d5c6-7bgxl" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.753390 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dqp69" event={"ID":"777d5c45-8727-451f-bb86-6048a03ceb0b","Type":"ContainerDied","Data":"1fe4960b90a4b4132db8466ffc49ad7253f667cc5746faa45ae3a5c8db74c0a5"} Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.753435 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fe4960b90a4b4132db8466ffc49ad7253f667cc5746faa45ae3a5c8db74c0a5" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.753514 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dqp69" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.757863 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-secret-key\") pod \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.757955 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e8975c-a2d8-4319-b175-2a66ce3d97c9-logs\") pod \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.757990 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-config-data\") pod \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.758008 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-combined-ca-bundle\") pod \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.758039 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-operator-scripts\") pod \"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25\" (UID: \"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.758087 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-tls-certs\") pod \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.758109 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ds9t\" (UniqueName: \"kubernetes.io/projected/18e8975c-a2d8-4319-b175-2a66ce3d97c9-kube-api-access-2ds9t\") pod \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.758244 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-scripts\") pod \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\" (UID: \"18e8975c-a2d8-4319-b175-2a66ce3d97c9\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.758267 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpvf6\" (UniqueName: \"kubernetes.io/projected/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-kube-api-access-hpvf6\") pod \"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25\" (UID: \"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25\") " Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.761218 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25" (UID: "e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.768473 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e8975c-a2d8-4319-b175-2a66ce3d97c9-logs" (OuterVolumeSpecName: "logs") pod "18e8975c-a2d8-4319-b175-2a66ce3d97c9" (UID: "18e8975c-a2d8-4319-b175-2a66ce3d97c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.771391 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e8975c-a2d8-4319-b175-2a66ce3d97c9-kube-api-access-2ds9t" (OuterVolumeSpecName: "kube-api-access-2ds9t") pod "18e8975c-a2d8-4319-b175-2a66ce3d97c9" (UID: "18e8975c-a2d8-4319-b175-2a66ce3d97c9"). InnerVolumeSpecName "kube-api-access-2ds9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.778218 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-kube-api-access-hpvf6" (OuterVolumeSpecName: "kube-api-access-hpvf6") pod "e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25" (UID: "e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25"). InnerVolumeSpecName "kube-api-access-hpvf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.778206 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "18e8975c-a2d8-4319-b175-2a66ce3d97c9" (UID: "18e8975c-a2d8-4319-b175-2a66ce3d97c9"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.779125 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6fd8-account-create-update-mcv9d" event={"ID":"e90432d5-5ca1-4447-9e0b-2afaafa0ba1b","Type":"ContainerDied","Data":"1b5357bbb2e316a09f8b3b32c892023682302cbcd1ac303624a4c0cc60c7727e"} Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.779172 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b5357bbb2e316a09f8b3b32c892023682302cbcd1ac303624a4c0cc60c7727e" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.779237 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6fd8-account-create-update-mcv9d" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.786043 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pcsgd" event={"ID":"62b48274-d738-49ad-81a5-a7c701193695","Type":"ContainerDied","Data":"cb2fce004344ed11a206db1eb1c9e44696ea9ec87eaf38792f0b30af7ec64707"} Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.786400 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb2fce004344ed11a206db1eb1c9e44696ea9ec87eaf38792f0b30af7ec64707" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.786346 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pcsgd" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.796688 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c6v9d" event={"ID":"e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25","Type":"ContainerDied","Data":"8b19bed78de1f97825f8c8e0c00e43bfceceac68ffea11545adf27877cc8019b"} Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.796743 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b19bed78de1f97825f8c8e0c00e43bfceceac68ffea11545adf27877cc8019b" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.796823 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c6v9d" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.798463 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-config-data" (OuterVolumeSpecName: "config-data") pod "18e8975c-a2d8-4319-b175-2a66ce3d97c9" (UID: "18e8975c-a2d8-4319-b175-2a66ce3d97c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.801414 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d299-account-create-update-xxxtz" event={"ID":"c2dafa6c-d784-48eb-926d-0648fb990dd4","Type":"ContainerDied","Data":"120c3364638503e625ce17aa539ef6bda4a553920782f085616556e0486d2280"} Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.801715 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="120c3364638503e625ce17aa539ef6bda4a553920782f085616556e0486d2280" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.801724 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d299-account-create-update-xxxtz" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.822292 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-scripts" (OuterVolumeSpecName: "scripts") pod "18e8975c-a2d8-4319-b175-2a66ce3d97c9" (UID: "18e8975c-a2d8-4319-b175-2a66ce3d97c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.834803 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18e8975c-a2d8-4319-b175-2a66ce3d97c9" (UID: "18e8975c-a2d8-4319-b175-2a66ce3d97c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.860087 4748 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.860320 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e8975c-a2d8-4319-b175-2a66ce3d97c9-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.860401 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.860476 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.860573 4748 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.860650 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ds9t\" (UniqueName: \"kubernetes.io/projected/18e8975c-a2d8-4319-b175-2a66ce3d97c9-kube-api-access-2ds9t\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.860720 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e8975c-a2d8-4319-b175-2a66ce3d97c9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.860799 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpvf6\" (UniqueName: \"kubernetes.io/projected/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25-kube-api-access-hpvf6\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.885935 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "18e8975c-a2d8-4319-b175-2a66ce3d97c9" (UID: "18e8975c-a2d8-4319-b175-2a66ce3d97c9"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.966301 4748 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e8975c-a2d8-4319-b175-2a66ce3d97c9-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.968073 4748 scope.go:117] "RemoveContainer" containerID="87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.987521 4748 scope.go:117] "RemoveContainer" containerID="1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f" Mar 20 10:57:59 crc kubenswrapper[4748]: E0320 10:57:59.988196 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f\": container with ID starting with 1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f not found: ID does not exist" containerID="1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.988268 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f"} err="failed to get container status \"1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f\": rpc error: code = NotFound desc = could not find container \"1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f\": container with ID starting with 1ef9c8f1e297ada419f5c18bfd6c48a8d01adf6e580eac1a741fdfad1d08a74f not found: ID does not exist" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.988312 4748 scope.go:117] "RemoveContainer" containerID="87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7" Mar 20 10:57:59 crc kubenswrapper[4748]: E0320 10:57:59.988948 4748 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7\": container with ID starting with 87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7 not found: ID does not exist" containerID="87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7" Mar 20 10:57:59 crc kubenswrapper[4748]: I0320 10:57:59.989018 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7"} err="failed to get container status \"87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7\": rpc error: code = NotFound desc = could not find container \"87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7\": container with ID starting with 87958b0500956376a8c4f78533010b993f91853465c77d78af833be6203bfca7 not found: ID does not exist" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.084259 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b85b9d5c6-7bgxl"] Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.118449 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b85b9d5c6-7bgxl"] Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.151893 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566738-k6md5"] Mar 20 10:58:00 crc kubenswrapper[4748]: E0320 10:58:00.152522 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90432d5-5ca1-4447-9e0b-2afaafa0ba1b" containerName="mariadb-account-create-update" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.152609 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90432d5-5ca1-4447-9e0b-2afaafa0ba1b" containerName="mariadb-account-create-update" Mar 20 10:58:00 crc kubenswrapper[4748]: E0320 10:58:00.152690 4748 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25" containerName="mariadb-database-create" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.152760 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25" containerName="mariadb-database-create" Mar 20 10:58:00 crc kubenswrapper[4748]: E0320 10:58:00.152896 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.153006 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon" Mar 20 10:58:00 crc kubenswrapper[4748]: E0320 10:58:00.153093 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon-log" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.153168 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon-log" Mar 20 10:58:00 crc kubenswrapper[4748]: E0320 10:58:00.153258 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde56056-a261-4e3a-8cd6-b703d33a14ca" containerName="mariadb-account-create-update" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.153327 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde56056-a261-4e3a-8cd6-b703d33a14ca" containerName="mariadb-account-create-update" Mar 20 10:58:00 crc kubenswrapper[4748]: E0320 10:58:00.153404 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b48274-d738-49ad-81a5-a7c701193695" containerName="mariadb-database-create" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.153500 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b48274-d738-49ad-81a5-a7c701193695" containerName="mariadb-database-create" Mar 20 10:58:00 crc kubenswrapper[4748]: E0320 10:58:00.153597 4748 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2dafa6c-d784-48eb-926d-0648fb990dd4" containerName="mariadb-account-create-update" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.153666 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2dafa6c-d784-48eb-926d-0648fb990dd4" containerName="mariadb-account-create-update" Mar 20 10:58:00 crc kubenswrapper[4748]: E0320 10:58:00.153744 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777d5c45-8727-451f-bb86-6048a03ceb0b" containerName="mariadb-database-create" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.153817 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="777d5c45-8727-451f-bb86-6048a03ceb0b" containerName="mariadb-database-create" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.154131 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde56056-a261-4e3a-8cd6-b703d33a14ca" containerName="mariadb-account-create-update" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.154220 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2dafa6c-d784-48eb-926d-0648fb990dd4" containerName="mariadb-account-create-update" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.154318 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" containerName="horizon" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.154400 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25" containerName="mariadb-database-create" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.154488 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="777d5c45-8727-451f-bb86-6048a03ceb0b" containerName="mariadb-database-create" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.154558 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" 
containerName="horizon-log" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.154634 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="e90432d5-5ca1-4447-9e0b-2afaafa0ba1b" containerName="mariadb-account-create-update" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.154709 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b48274-d738-49ad-81a5-a7c701193695" containerName="mariadb-database-create" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.155612 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-k6md5" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.159274 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.163158 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.163655 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.164052 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-k6md5"] Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.272435 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xhlr\" (UniqueName: \"kubernetes.io/projected/50ebd6f9-305e-482a-9f7a-e5a63e04921a-kube-api-access-6xhlr\") pod \"auto-csr-approver-29566738-k6md5\" (UID: \"50ebd6f9-305e-482a-9f7a-e5a63e04921a\") " pod="openshift-infra/auto-csr-approver-29566738-k6md5" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.373885 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xhlr\" (UniqueName: 
\"kubernetes.io/projected/50ebd6f9-305e-482a-9f7a-e5a63e04921a-kube-api-access-6xhlr\") pod \"auto-csr-approver-29566738-k6md5\" (UID: \"50ebd6f9-305e-482a-9f7a-e5a63e04921a\") " pod="openshift-infra/auto-csr-approver-29566738-k6md5" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.404147 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xhlr\" (UniqueName: \"kubernetes.io/projected/50ebd6f9-305e-482a-9f7a-e5a63e04921a-kube-api-access-6xhlr\") pod \"auto-csr-approver-29566738-k6md5\" (UID: \"50ebd6f9-305e-482a-9f7a-e5a63e04921a\") " pod="openshift-infra/auto-csr-approver-29566738-k6md5" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.499662 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-k6md5" Mar 20 10:58:00 crc kubenswrapper[4748]: I0320 10:58:00.952483 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-k6md5"] Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.254016 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bs7cs"] Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.255765 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.264554 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m7mzz" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.264673 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.264884 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.265470 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bs7cs"] Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.418075 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-scripts\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.418157 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.418226 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-config-data\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " 
pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.418438 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd92x\" (UniqueName: \"kubernetes.io/projected/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-kube-api-access-fd92x\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.519952 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-scripts\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.520023 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.520048 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-config-data\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.520123 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd92x\" (UniqueName: \"kubernetes.io/projected/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-kube-api-access-fd92x\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: 
\"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.528873 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e8975c-a2d8-4319-b175-2a66ce3d97c9" path="/var/lib/kubelet/pods/18e8975c-a2d8-4319-b175-2a66ce3d97c9/volumes" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.530823 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-config-data\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.536425 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-scripts\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.536450 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.542432 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd92x\" (UniqueName: \"kubernetes.io/projected/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-kube-api-access-fd92x\") pod \"nova-cell0-conductor-db-sync-bs7cs\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.633858 4748 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:01 crc kubenswrapper[4748]: I0320 10:58:01.909514 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-k6md5" event={"ID":"50ebd6f9-305e-482a-9f7a-e5a63e04921a","Type":"ContainerStarted","Data":"b61f2588b854288b01c974a1fdf3976c4eba8e075ab1bc39f7485a84396b7052"} Mar 20 10:58:02 crc kubenswrapper[4748]: W0320 10:58:02.332642 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda154fa7a_7ef7_4c8b_ac86_b0508e4c1cb9.slice/crio-569e1001e44de1d02ce2847704b18bce5d56d8ccc7e31fd68352f1d72304f1b6 WatchSource:0}: Error finding container 569e1001e44de1d02ce2847704b18bce5d56d8ccc7e31fd68352f1d72304f1b6: Status 404 returned error can't find the container with id 569e1001e44de1d02ce2847704b18bce5d56d8ccc7e31fd68352f1d72304f1b6 Mar 20 10:58:02 crc kubenswrapper[4748]: I0320 10:58:02.335293 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bs7cs"] Mar 20 10:58:02 crc kubenswrapper[4748]: I0320 10:58:02.920814 4748 generic.go:334] "Generic (PLEG): container finished" podID="50ebd6f9-305e-482a-9f7a-e5a63e04921a" containerID="49a3fb574b78615d975ca586c2e94ce881cd83d29c263d85d451be63e9565da0" exitCode=0 Mar 20 10:58:02 crc kubenswrapper[4748]: I0320 10:58:02.920867 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-k6md5" event={"ID":"50ebd6f9-305e-482a-9f7a-e5a63e04921a","Type":"ContainerDied","Data":"49a3fb574b78615d975ca586c2e94ce881cd83d29c263d85d451be63e9565da0"} Mar 20 10:58:02 crc kubenswrapper[4748]: I0320 10:58:02.924107 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"847669ae-106d-4f9f-8f74-ea1f8e2d221d","Type":"ContainerStarted","Data":"19fc679c210222d1ff016794891f4e94e4de33a4324d0a3466fe5ac80621233f"} Mar 20 10:58:02 crc kubenswrapper[4748]: I0320 10:58:02.924256 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 10:58:02 crc kubenswrapper[4748]: I0320 10:58:02.925370 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bs7cs" event={"ID":"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9","Type":"ContainerStarted","Data":"569e1001e44de1d02ce2847704b18bce5d56d8ccc7e31fd68352f1d72304f1b6"} Mar 20 10:58:02 crc kubenswrapper[4748]: I0320 10:58:02.962512 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.279259485 podStartE2EDuration="10.962486927s" podCreationTimestamp="2026-03-20 10:57:52 +0000 UTC" firstStartedPulling="2026-03-20 10:57:54.061789853 +0000 UTC m=+1309.203335657" lastFinishedPulling="2026-03-20 10:58:01.745017285 +0000 UTC m=+1316.886563099" observedRunningTime="2026-03-20 10:58:02.954694311 +0000 UTC m=+1318.096240125" watchObservedRunningTime="2026-03-20 10:58:02.962486927 +0000 UTC m=+1318.104032751" Mar 20 10:58:03 crc kubenswrapper[4748]: I0320 10:58:03.291123 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c66c949c-9cv26" Mar 20 10:58:03 crc kubenswrapper[4748]: I0320 10:58:03.718704 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.341411 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-k6md5" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.495333 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xhlr\" (UniqueName: \"kubernetes.io/projected/50ebd6f9-305e-482a-9f7a-e5a63e04921a-kube-api-access-6xhlr\") pod \"50ebd6f9-305e-482a-9f7a-e5a63e04921a\" (UID: \"50ebd6f9-305e-482a-9f7a-e5a63e04921a\") " Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.503150 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ebd6f9-305e-482a-9f7a-e5a63e04921a-kube-api-access-6xhlr" (OuterVolumeSpecName: "kube-api-access-6xhlr") pod "50ebd6f9-305e-482a-9f7a-e5a63e04921a" (UID: "50ebd6f9-305e-482a-9f7a-e5a63e04921a"). InnerVolumeSpecName "kube-api-access-6xhlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.561881 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.600086 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xhlr\" (UniqueName: \"kubernetes.io/projected/50ebd6f9-305e-482a-9f7a-e5a63e04921a-kube-api-access-6xhlr\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.701228 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htpsf\" (UniqueName: \"kubernetes.io/projected/c8728f34-b5b9-4ced-9424-b83ff940580f-kube-api-access-htpsf\") pod \"c8728f34-b5b9-4ced-9424-b83ff940580f\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.701576 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-httpd-config\") pod \"c8728f34-b5b9-4ced-9424-b83ff940580f\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.701706 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-config\") pod \"c8728f34-b5b9-4ced-9424-b83ff940580f\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.701759 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-combined-ca-bundle\") pod \"c8728f34-b5b9-4ced-9424-b83ff940580f\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.701987 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-ovndb-tls-certs\") pod \"c8728f34-b5b9-4ced-9424-b83ff940580f\" (UID: \"c8728f34-b5b9-4ced-9424-b83ff940580f\") " Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.706936 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c8728f34-b5b9-4ced-9424-b83ff940580f" (UID: "c8728f34-b5b9-4ced-9424-b83ff940580f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.707140 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8728f34-b5b9-4ced-9424-b83ff940580f-kube-api-access-htpsf" (OuterVolumeSpecName: "kube-api-access-htpsf") pod "c8728f34-b5b9-4ced-9424-b83ff940580f" (UID: "c8728f34-b5b9-4ced-9424-b83ff940580f"). InnerVolumeSpecName "kube-api-access-htpsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.769530 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-config" (OuterVolumeSpecName: "config") pod "c8728f34-b5b9-4ced-9424-b83ff940580f" (UID: "c8728f34-b5b9-4ced-9424-b83ff940580f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.793865 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8728f34-b5b9-4ced-9424-b83ff940580f" (UID: "c8728f34-b5b9-4ced-9424-b83ff940580f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.800096 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c8728f34-b5b9-4ced-9424-b83ff940580f" (UID: "c8728f34-b5b9-4ced-9424-b83ff940580f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.804934 4748 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.804966 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.804976 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.804988 4748 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8728f34-b5b9-4ced-9424-b83ff940580f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.804995 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htpsf\" (UniqueName: \"kubernetes.io/projected/c8728f34-b5b9-4ced-9424-b83ff940580f-kube-api-access-htpsf\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.962641 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-k6md5" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.963006 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-k6md5" event={"ID":"50ebd6f9-305e-482a-9f7a-e5a63e04921a","Type":"ContainerDied","Data":"b61f2588b854288b01c974a1fdf3976c4eba8e075ab1bc39f7485a84396b7052"} Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.963049 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b61f2588b854288b01c974a1fdf3976c4eba8e075ab1bc39f7485a84396b7052" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.976136 4748 generic.go:334] "Generic (PLEG): container finished" podID="c8728f34-b5b9-4ced-9424-b83ff940580f" containerID="a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa" exitCode=0 Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.976218 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589cf645cb-wkg45" event={"ID":"c8728f34-b5b9-4ced-9424-b83ff940580f","Type":"ContainerDied","Data":"a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa"} Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.976230 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-589cf645cb-wkg45" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.976262 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-589cf645cb-wkg45" event={"ID":"c8728f34-b5b9-4ced-9424-b83ff940580f","Type":"ContainerDied","Data":"50ff638ef6da43b65c0a92fd98e016e06e128fed98784dc44857c6bfbf24f48c"} Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.976286 4748 scope.go:117] "RemoveContainer" containerID="eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45" Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.976675 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="proxy-httpd" containerID="cri-o://19fc679c210222d1ff016794891f4e94e4de33a4324d0a3466fe5ac80621233f" gracePeriod=30 Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.976709 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="ceilometer-central-agent" containerID="cri-o://d4cbd022be390d67fc0a3423cfe7cffd3eabc70daf6a0fd850ed78e3f32e83e5" gracePeriod=30 Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.976675 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="sg-core" containerID="cri-o://5378fa20cdc8defe0e8c30928c7e93473341304ae8d67e7d815e059dbbb01837" gracePeriod=30 Mar 20 10:58:04 crc kubenswrapper[4748]: I0320 10:58:04.976731 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="ceilometer-notification-agent" containerID="cri-o://b548843003db3272aba4481961ab94ee37fe5d87780e1d7d7b1100cbb1db6c7e" gracePeriod=30 Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 
10:58:05.067351 4748 scope.go:117] "RemoveContainer" containerID="a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa" Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 10:58:05.072547 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-589cf645cb-wkg45"] Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 10:58:05.080367 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-589cf645cb-wkg45"] Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 10:58:05.171376 4748 scope.go:117] "RemoveContainer" containerID="eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45" Mar 20 10:58:05 crc kubenswrapper[4748]: E0320 10:58:05.173638 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45\": container with ID starting with eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45 not found: ID does not exist" containerID="eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45" Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 10:58:05.173697 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45"} err="failed to get container status \"eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45\": rpc error: code = NotFound desc = could not find container \"eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45\": container with ID starting with eeb50da6d9d48d7097bcc7e766ded49dbca3fb46e5b62b031e89186a098b6d45 not found: ID does not exist" Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 10:58:05.173730 4748 scope.go:117] "RemoveContainer" containerID="a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa" Mar 20 10:58:05 crc kubenswrapper[4748]: E0320 10:58:05.174203 4748 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa\": container with ID starting with a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa not found: ID does not exist" containerID="a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa" Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 10:58:05.174225 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa"} err="failed to get container status \"a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa\": rpc error: code = NotFound desc = could not find container \"a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa\": container with ID starting with a0b237751dc526e15e76f71e5acad80dcc3d5b8a27566599661be2cfd8af52aa not found: ID does not exist" Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 10:58:05.421196 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566732-vvq4n"] Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 10:58:05.430492 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566732-vvq4n"] Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 10:58:05.536673 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b93435-c7d8-4adc-856f-5b318567767e" path="/var/lib/kubelet/pods/c2b93435-c7d8-4adc-856f-5b318567767e/volumes" Mar 20 10:58:05 crc kubenswrapper[4748]: I0320 10:58:05.537969 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8728f34-b5b9-4ced-9424-b83ff940580f" path="/var/lib/kubelet/pods/c8728f34-b5b9-4ced-9424-b83ff940580f/volumes" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.026781 4748 generic.go:334] "Generic (PLEG): container finished" podID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" 
containerID="19fc679c210222d1ff016794891f4e94e4de33a4324d0a3466fe5ac80621233f" exitCode=0 Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.027142 4748 generic.go:334] "Generic (PLEG): container finished" podID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerID="5378fa20cdc8defe0e8c30928c7e93473341304ae8d67e7d815e059dbbb01837" exitCode=2 Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.027155 4748 generic.go:334] "Generic (PLEG): container finished" podID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerID="b548843003db3272aba4481961ab94ee37fe5d87780e1d7d7b1100cbb1db6c7e" exitCode=0 Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.027168 4748 generic.go:334] "Generic (PLEG): container finished" podID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerID="d4cbd022be390d67fc0a3423cfe7cffd3eabc70daf6a0fd850ed78e3f32e83e5" exitCode=0 Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.027195 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"847669ae-106d-4f9f-8f74-ea1f8e2d221d","Type":"ContainerDied","Data":"19fc679c210222d1ff016794891f4e94e4de33a4324d0a3466fe5ac80621233f"} Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.027233 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"847669ae-106d-4f9f-8f74-ea1f8e2d221d","Type":"ContainerDied","Data":"5378fa20cdc8defe0e8c30928c7e93473341304ae8d67e7d815e059dbbb01837"} Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.027250 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"847669ae-106d-4f9f-8f74-ea1f8e2d221d","Type":"ContainerDied","Data":"b548843003db3272aba4481961ab94ee37fe5d87780e1d7d7b1100cbb1db6c7e"} Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.027261 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"847669ae-106d-4f9f-8f74-ea1f8e2d221d","Type":"ContainerDied","Data":"d4cbd022be390d67fc0a3423cfe7cffd3eabc70daf6a0fd850ed78e3f32e83e5"} Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.027272 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"847669ae-106d-4f9f-8f74-ea1f8e2d221d","Type":"ContainerDied","Data":"100c881e9aa7efcaa37f6078271596fdccae1d2e5f3fe70104b864275a1364e5"} Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.027284 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="100c881e9aa7efcaa37f6078271596fdccae1d2e5f3fe70104b864275a1364e5" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.071989 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.146859 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-combined-ca-bundle\") pod \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.147002 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-run-httpd\") pod \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.147061 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-scripts\") pod \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.147153 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-sg-core-conf-yaml\") pod \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.147216 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj4rj\" (UniqueName: \"kubernetes.io/projected/847669ae-106d-4f9f-8f74-ea1f8e2d221d-kube-api-access-xj4rj\") pod \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.147244 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-config-data\") pod \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.147292 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-log-httpd\") pod \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\" (UID: \"847669ae-106d-4f9f-8f74-ea1f8e2d221d\") " Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.147559 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "847669ae-106d-4f9f-8f74-ea1f8e2d221d" (UID: "847669ae-106d-4f9f-8f74-ea1f8e2d221d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.147851 4748 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.148321 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "847669ae-106d-4f9f-8f74-ea1f8e2d221d" (UID: "847669ae-106d-4f9f-8f74-ea1f8e2d221d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.156551 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/847669ae-106d-4f9f-8f74-ea1f8e2d221d-kube-api-access-xj4rj" (OuterVolumeSpecName: "kube-api-access-xj4rj") pod "847669ae-106d-4f9f-8f74-ea1f8e2d221d" (UID: "847669ae-106d-4f9f-8f74-ea1f8e2d221d"). InnerVolumeSpecName "kube-api-access-xj4rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.157808 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-scripts" (OuterVolumeSpecName: "scripts") pod "847669ae-106d-4f9f-8f74-ea1f8e2d221d" (UID: "847669ae-106d-4f9f-8f74-ea1f8e2d221d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.177221 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "847669ae-106d-4f9f-8f74-ea1f8e2d221d" (UID: "847669ae-106d-4f9f-8f74-ea1f8e2d221d"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.224525 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "847669ae-106d-4f9f-8f74-ea1f8e2d221d" (UID: "847669ae-106d-4f9f-8f74-ea1f8e2d221d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.250238 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.250632 4748 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.250652 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj4rj\" (UniqueName: \"kubernetes.io/projected/847669ae-106d-4f9f-8f74-ea1f8e2d221d-kube-api-access-xj4rj\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.250667 4748 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/847669ae-106d-4f9f-8f74-ea1f8e2d221d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.250681 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.299320 4748 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-config-data" (OuterVolumeSpecName: "config-data") pod "847669ae-106d-4f9f-8f74-ea1f8e2d221d" (UID: "847669ae-106d-4f9f-8f74-ea1f8e2d221d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:06 crc kubenswrapper[4748]: I0320 10:58:06.352621 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/847669ae-106d-4f9f-8f74-ea1f8e2d221d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.045897 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.100566 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.119107 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.142368 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:07 crc kubenswrapper[4748]: E0320 10:58:07.142954 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8728f34-b5b9-4ced-9424-b83ff940580f" containerName="neutron-httpd" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.142973 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8728f34-b5b9-4ced-9424-b83ff940580f" containerName="neutron-httpd" Mar 20 10:58:07 crc kubenswrapper[4748]: E0320 10:58:07.142998 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8728f34-b5b9-4ced-9424-b83ff940580f" containerName="neutron-api" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143007 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8728f34-b5b9-4ced-9424-b83ff940580f" containerName="neutron-api" Mar 20 10:58:07 crc 
kubenswrapper[4748]: E0320 10:58:07.143026 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="proxy-httpd" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143033 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="proxy-httpd" Mar 20 10:58:07 crc kubenswrapper[4748]: E0320 10:58:07.143052 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="ceilometer-notification-agent" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143061 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="ceilometer-notification-agent" Mar 20 10:58:07 crc kubenswrapper[4748]: E0320 10:58:07.143076 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="ceilometer-central-agent" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143086 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="ceilometer-central-agent" Mar 20 10:58:07 crc kubenswrapper[4748]: E0320 10:58:07.143095 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ebd6f9-305e-482a-9f7a-e5a63e04921a" containerName="oc" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143103 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ebd6f9-305e-482a-9f7a-e5a63e04921a" containerName="oc" Mar 20 10:58:07 crc kubenswrapper[4748]: E0320 10:58:07.143123 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="sg-core" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143133 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="sg-core" Mar 20 10:58:07 crc 
kubenswrapper[4748]: I0320 10:58:07.143389 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="proxy-httpd" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143407 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8728f34-b5b9-4ced-9424-b83ff940580f" containerName="neutron-api" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143427 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="ceilometer-central-agent" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143444 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="sg-core" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143460 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" containerName="ceilometer-notification-agent" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143473 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ebd6f9-305e-482a-9f7a-e5a63e04921a" containerName="oc" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.143484 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8728f34-b5b9-4ced-9424-b83ff940580f" containerName="neutron-httpd" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.145731 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.149242 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.149457 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.158243 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.171144 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-log-httpd\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.171193 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-config-data\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.171227 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.171275 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8c6w\" (UniqueName: \"kubernetes.io/projected/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-kube-api-access-c8c6w\") pod \"ceilometer-0\" (UID: 
\"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.171311 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-scripts\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.171359 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.171399 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-run-httpd\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.273777 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-log-httpd\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.274178 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-config-data\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.274215 4748 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.274272 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8c6w\" (UniqueName: \"kubernetes.io/projected/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-kube-api-access-c8c6w\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.274345 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-scripts\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.274450 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-log-httpd\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.275275 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.275363 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-run-httpd\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 
20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.275758 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-run-httpd\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.280557 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-scripts\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.283043 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.283220 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-config-data\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.294297 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.299775 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8c6w\" (UniqueName: \"kubernetes.io/projected/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-kube-api-access-c8c6w\") pod \"ceilometer-0\" 
(UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.482715 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:58:07 crc kubenswrapper[4748]: I0320 10:58:07.529398 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="847669ae-106d-4f9f-8f74-ea1f8e2d221d" path="/var/lib/kubelet/pods/847669ae-106d-4f9f-8f74-ea1f8e2d221d/volumes" Mar 20 10:58:12 crc kubenswrapper[4748]: I0320 10:58:12.584080 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:12 crc kubenswrapper[4748]: W0320 10:58:12.600032 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod246fb9c9_9e51_4acc_b93b_a630b9d65d6d.slice/crio-cfe3990da9af2edf9b0906f89dd103a3625ff74779238f50081f8fd8c6871de1 WatchSource:0}: Error finding container cfe3990da9af2edf9b0906f89dd103a3625ff74779238f50081f8fd8c6871de1: Status 404 returned error can't find the container with id cfe3990da9af2edf9b0906f89dd103a3625ff74779238f50081f8fd8c6871de1 Mar 20 10:58:12 crc kubenswrapper[4748]: I0320 10:58:12.928924 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:58:12 crc kubenswrapper[4748]: I0320 10:58:12.929002 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:58:13 crc kubenswrapper[4748]: I0320 10:58:13.118923 
4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bs7cs" event={"ID":"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9","Type":"ContainerStarted","Data":"5a49b5d8f18000cabf840c3c9026fdb78f1a5f83b9c6cce775ffeaa423653f08"} Mar 20 10:58:13 crc kubenswrapper[4748]: I0320 10:58:13.121556 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2b2f2b26-6292-47bb-b8ee-971d9b47c85d","Type":"ContainerStarted","Data":"86ef53e17c0742228abc224f3a70f5909965a3b831d0269b888c18db087f1695"} Mar 20 10:58:13 crc kubenswrapper[4748]: I0320 10:58:13.123187 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"246fb9c9-9e51-4acc-b93b-a630b9d65d6d","Type":"ContainerStarted","Data":"cfe3990da9af2edf9b0906f89dd103a3625ff74779238f50081f8fd8c6871de1"} Mar 20 10:58:13 crc kubenswrapper[4748]: I0320 10:58:13.139918 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bs7cs" podStartSLOduration=2.33323837 podStartE2EDuration="12.139894689s" podCreationTimestamp="2026-03-20 10:58:01 +0000 UTC" firstStartedPulling="2026-03-20 10:58:02.335135683 +0000 UTC m=+1317.476681497" lastFinishedPulling="2026-03-20 10:58:12.141792012 +0000 UTC m=+1327.283337816" observedRunningTime="2026-03-20 10:58:13.134716909 +0000 UTC m=+1328.276262783" watchObservedRunningTime="2026-03-20 10:58:13.139894689 +0000 UTC m=+1328.281440503" Mar 20 10:58:13 crc kubenswrapper[4748]: I0320 10:58:13.154169 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.755812927 podStartE2EDuration="39.154146107s" podCreationTimestamp="2026-03-20 10:57:34 +0000 UTC" firstStartedPulling="2026-03-20 10:57:35.746105639 +0000 UTC m=+1290.887651453" lastFinishedPulling="2026-03-20 10:58:12.144438829 +0000 UTC m=+1327.285984633" observedRunningTime="2026-03-20 10:58:13.148962727 +0000 
UTC m=+1328.290508711" watchObservedRunningTime="2026-03-20 10:58:13.154146107 +0000 UTC m=+1328.295691921" Mar 20 10:58:14 crc kubenswrapper[4748]: I0320 10:58:14.135022 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"246fb9c9-9e51-4acc-b93b-a630b9d65d6d","Type":"ContainerStarted","Data":"008873a1a263b9df636360272d0f050ef0e564b8e9ef2b35e92e27442ee77c86"} Mar 20 10:58:14 crc kubenswrapper[4748]: I0320 10:58:14.135629 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"246fb9c9-9e51-4acc-b93b-a630b9d65d6d","Type":"ContainerStarted","Data":"7df4a7959b03db15ccbc3d1cd86e1cd6e8de9867705181272a1e3ae10b2473dc"} Mar 20 10:58:14 crc kubenswrapper[4748]: I0320 10:58:14.611867 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:15 crc kubenswrapper[4748]: I0320 10:58:15.146372 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"246fb9c9-9e51-4acc-b93b-a630b9d65d6d","Type":"ContainerStarted","Data":"7cd5e8205006dfbdd4f33f086718c1d4b03326a930a7c42ea5cec3dbf6bf18d1"} Mar 20 10:58:15 crc kubenswrapper[4748]: I0320 10:58:15.233911 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:58:15 crc kubenswrapper[4748]: I0320 10:58:15.234243 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" containerName="glance-log" containerID="cri-o://065f22a766d13285d2727ecf62800ca8102915dd8b53b27ce6bd138ca82c75d0" gracePeriod=30 Mar 20 10:58:15 crc kubenswrapper[4748]: I0320 10:58:15.234319 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" containerName="glance-httpd" 
containerID="cri-o://ec96f6c09a0d2fe8560c444f5939eb9b0f5bb271f3d6e415b8cd9d420c3f3d97" gracePeriod=30 Mar 20 10:58:16 crc kubenswrapper[4748]: I0320 10:58:16.164107 4748 generic.go:334] "Generic (PLEG): container finished" podID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" containerID="065f22a766d13285d2727ecf62800ca8102915dd8b53b27ce6bd138ca82c75d0" exitCode=143 Mar 20 10:58:16 crc kubenswrapper[4748]: I0320 10:58:16.164195 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095","Type":"ContainerDied","Data":"065f22a766d13285d2727ecf62800ca8102915dd8b53b27ce6bd138ca82c75d0"} Mar 20 10:58:16 crc kubenswrapper[4748]: I0320 10:58:16.532538 4748 scope.go:117] "RemoveContainer" containerID="e8c94ff1d5e9a53a0a5d8c040dd94d943d991dc9ef7fe18e2ca0b299ba2c0393" Mar 20 10:58:17 crc kubenswrapper[4748]: I0320 10:58:17.181892 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"246fb9c9-9e51-4acc-b93b-a630b9d65d6d","Type":"ContainerStarted","Data":"35dde74b88e408e8e7cfff28ce2fdf992cf0dff4e05b7d0fd2f64f7b6d0be800"} Mar 20 10:58:17 crc kubenswrapper[4748]: I0320 10:58:17.182065 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="ceilometer-central-agent" containerID="cri-o://7df4a7959b03db15ccbc3d1cd86e1cd6e8de9867705181272a1e3ae10b2473dc" gracePeriod=30 Mar 20 10:58:17 crc kubenswrapper[4748]: I0320 10:58:17.182123 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="proxy-httpd" containerID="cri-o://35dde74b88e408e8e7cfff28ce2fdf992cf0dff4e05b7d0fd2f64f7b6d0be800" gracePeriod=30 Mar 20 10:58:17 crc kubenswrapper[4748]: I0320 10:58:17.182129 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Mar 20 10:58:17 crc kubenswrapper[4748]: I0320 10:58:17.182172 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="sg-core" containerID="cri-o://7cd5e8205006dfbdd4f33f086718c1d4b03326a930a7c42ea5cec3dbf6bf18d1" gracePeriod=30 Mar 20 10:58:17 crc kubenswrapper[4748]: I0320 10:58:17.182159 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="ceilometer-notification-agent" containerID="cri-o://008873a1a263b9df636360272d0f050ef0e564b8e9ef2b35e92e27442ee77c86" gracePeriod=30 Mar 20 10:58:17 crc kubenswrapper[4748]: I0320 10:58:17.214196 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.478784474 podStartE2EDuration="10.214173043s" podCreationTimestamp="2026-03-20 10:58:07 +0000 UTC" firstStartedPulling="2026-03-20 10:58:12.603178231 +0000 UTC m=+1327.744724045" lastFinishedPulling="2026-03-20 10:58:16.3385668 +0000 UTC m=+1331.480112614" observedRunningTime="2026-03-20 10:58:17.208965853 +0000 UTC m=+1332.350511677" watchObservedRunningTime="2026-03-20 10:58:17.214173043 +0000 UTC m=+1332.355718857" Mar 20 10:58:18 crc kubenswrapper[4748]: I0320 10:58:18.193659 4748 generic.go:334] "Generic (PLEG): container finished" podID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerID="35dde74b88e408e8e7cfff28ce2fdf992cf0dff4e05b7d0fd2f64f7b6d0be800" exitCode=0 Mar 20 10:58:18 crc kubenswrapper[4748]: I0320 10:58:18.194017 4748 generic.go:334] "Generic (PLEG): container finished" podID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerID="7cd5e8205006dfbdd4f33f086718c1d4b03326a930a7c42ea5cec3dbf6bf18d1" exitCode=2 Mar 20 10:58:18 crc kubenswrapper[4748]: I0320 10:58:18.194032 4748 generic.go:334] "Generic (PLEG): container finished" 
podID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerID="008873a1a263b9df636360272d0f050ef0e564b8e9ef2b35e92e27442ee77c86" exitCode=0 Mar 20 10:58:18 crc kubenswrapper[4748]: I0320 10:58:18.193903 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"246fb9c9-9e51-4acc-b93b-a630b9d65d6d","Type":"ContainerDied","Data":"35dde74b88e408e8e7cfff28ce2fdf992cf0dff4e05b7d0fd2f64f7b6d0be800"} Mar 20 10:58:18 crc kubenswrapper[4748]: I0320 10:58:18.194080 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"246fb9c9-9e51-4acc-b93b-a630b9d65d6d","Type":"ContainerDied","Data":"7cd5e8205006dfbdd4f33f086718c1d4b03326a930a7c42ea5cec3dbf6bf18d1"} Mar 20 10:58:18 crc kubenswrapper[4748]: I0320 10:58:18.194101 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"246fb9c9-9e51-4acc-b93b-a630b9d65d6d","Type":"ContainerDied","Data":"008873a1a263b9df636360272d0f050ef0e564b8e9ef2b35e92e27442ee77c86"} Mar 20 10:58:19 crc kubenswrapper[4748]: I0320 10:58:19.212239 4748 generic.go:334] "Generic (PLEG): container finished" podID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" containerID="ec96f6c09a0d2fe8560c444f5939eb9b0f5bb271f3d6e415b8cd9d420c3f3d97" exitCode=0 Mar 20 10:58:19 crc kubenswrapper[4748]: I0320 10:58:19.212292 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095","Type":"ContainerDied","Data":"ec96f6c09a0d2fe8560c444f5939eb9b0f5bb271f3d6e415b8cd9d420c3f3d97"} Mar 20 10:58:19 crc kubenswrapper[4748]: I0320 10:58:19.444404 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:58:19 crc kubenswrapper[4748]: I0320 10:58:19.445061 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" 
containerName="glance-log" containerID="cri-o://12a28e7df4ad27261901712b424f1439a97b18d7aecab5a7b19754232c5975d5" gracePeriod=30 Mar 20 10:58:19 crc kubenswrapper[4748]: I0320 10:58:19.445163 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" containerName="glance-httpd" containerID="cri-o://df4ecb4636ea9e29b83162f356f15c0a5c472e8b049de36d07077b5bb611b6dc" gracePeriod=30 Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.236940 4748 generic.go:334] "Generic (PLEG): container finished" podID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" containerID="12a28e7df4ad27261901712b424f1439a97b18d7aecab5a7b19754232c5975d5" exitCode=143 Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.237071 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f4ca5c7-6a70-485e-b39d-e786ad0004a5","Type":"ContainerDied","Data":"12a28e7df4ad27261901712b424f1439a97b18d7aecab5a7b19754232c5975d5"} Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.656968 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.846070 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-config-data\") pod \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.846143 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-scripts\") pod \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.846310 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nncx5\" (UniqueName: \"kubernetes.io/projected/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-kube-api-access-nncx5\") pod \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.846355 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-logs\") pod \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.846446 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-combined-ca-bundle\") pod \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.846494 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.847068 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-httpd-run\") pod \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.847116 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-public-tls-certs\") pod \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\" (UID: \"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095\") " Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.847148 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-logs" (OuterVolumeSpecName: "logs") pod "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" (UID: "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.847398 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" (UID: "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.847772 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.847801 4748 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.852752 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-kube-api-access-nncx5" (OuterVolumeSpecName: "kube-api-access-nncx5") pod "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" (UID: "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095"). InnerVolumeSpecName "kube-api-access-nncx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.853794 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" (UID: "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.854892 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-scripts" (OuterVolumeSpecName: "scripts") pod "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" (UID: "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.878164 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" (UID: "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.907624 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" (UID: "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.918809 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-config-data" (OuterVolumeSpecName: "config-data") pod "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" (UID: "3de89ba4-6b05-410a-a7a0-e6ec1e1ba095"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.950393 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.950437 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nncx5\" (UniqueName: \"kubernetes.io/projected/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-kube-api-access-nncx5\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.950453 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.950512 4748 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.950530 4748 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.950541 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:20 crc kubenswrapper[4748]: I0320 10:58:20.969101 4748 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.051762 4748 reconciler_common.go:293] "Volume detached for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.247647 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3de89ba4-6b05-410a-a7a0-e6ec1e1ba095","Type":"ContainerDied","Data":"04d6864f5cfac2f58a829489b9bdc1b6c4e7fec51052db50a77a31f8dcb39c8b"} Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.247710 4748 scope.go:117] "RemoveContainer" containerID="ec96f6c09a0d2fe8560c444f5939eb9b0f5bb271f3d6e415b8cd9d420c3f3d97" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.247727 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.311850 4748 scope.go:117] "RemoveContainer" containerID="065f22a766d13285d2727ecf62800ca8102915dd8b53b27ce6bd138ca82c75d0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.337606 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.376510 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.386200 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:58:21 crc kubenswrapper[4748]: E0320 10:58:21.386809 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" containerName="glance-httpd" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.386846 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" containerName="glance-httpd" Mar 20 10:58:21 crc kubenswrapper[4748]: E0320 10:58:21.386876 4748 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" containerName="glance-log" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.386885 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" containerName="glance-log" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.387136 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" containerName="glance-httpd" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.387163 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" containerName="glance-log" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.388955 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.395270 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.395489 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.411390 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.529442 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de89ba4-6b05-410a-a7a0-e6ec1e1ba095" path="/var/lib/kubelet/pods/3de89ba4-6b05-410a-a7a0-e6ec1e1ba095/volumes" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.559690 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " 
pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.560037 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.560171 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-scripts\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.560282 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ffe44fec-9121-46b2-9087-eba59b656915-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.560438 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4hf\" (UniqueName: \"kubernetes.io/projected/ffe44fec-9121-46b2-9087-eba59b656915-kube-api-access-xc4hf\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.560865 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: 
\"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.560981 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-config-data\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.561209 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe44fec-9121-46b2-9087-eba59b656915-logs\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.663570 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.663912 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-config-data\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.663961 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe44fec-9121-46b2-9087-eba59b656915-logs\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " 
pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.664110 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.664455 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe44fec-9121-46b2-9087-eba59b656915-logs\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.666864 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.666952 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.666994 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-scripts\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc 
kubenswrapper[4748]: I0320 10:58:21.667020 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ffe44fec-9121-46b2-9087-eba59b656915-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.667094 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4hf\" (UniqueName: \"kubernetes.io/projected/ffe44fec-9121-46b2-9087-eba59b656915-kube-api-access-xc4hf\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.667604 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ffe44fec-9121-46b2-9087-eba59b656915-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.669082 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.671106 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.672269 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-scripts\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.678428 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffe44fec-9121-46b2-9087-eba59b656915-config-data\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.684297 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4hf\" (UniqueName: \"kubernetes.io/projected/ffe44fec-9121-46b2-9087-eba59b656915-kube-api-access-xc4hf\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.700752 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ffe44fec-9121-46b2-9087-eba59b656915\") " pod="openstack/glance-default-external-api-0" Mar 20 10:58:21 crc kubenswrapper[4748]: I0320 10:58:21.706621 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 10:58:22 crc kubenswrapper[4748]: I0320 10:58:22.273650 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 10:58:22 crc kubenswrapper[4748]: W0320 10:58:22.283528 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffe44fec_9121_46b2_9087_eba59b656915.slice/crio-21b073bf75ea1e6a3ff01b100cc1d4cf134edb35281ba4ae453b16c0de5bfe05 WatchSource:0}: Error finding container 21b073bf75ea1e6a3ff01b100cc1d4cf134edb35281ba4ae453b16c0de5bfe05: Status 404 returned error can't find the container with id 21b073bf75ea1e6a3ff01b100cc1d4cf134edb35281ba4ae453b16c0de5bfe05 Mar 20 10:58:23 crc kubenswrapper[4748]: I0320 10:58:23.267487 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ffe44fec-9121-46b2-9087-eba59b656915","Type":"ContainerStarted","Data":"7bbd6c6e24056748414ab2e2ce3ebe3c543a99af96d4355b3b82eafabcb25a66"} Mar 20 10:58:23 crc kubenswrapper[4748]: I0320 10:58:23.267821 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ffe44fec-9121-46b2-9087-eba59b656915","Type":"ContainerStarted","Data":"21b073bf75ea1e6a3ff01b100cc1d4cf134edb35281ba4ae453b16c0de5bfe05"} Mar 20 10:58:23 crc kubenswrapper[4748]: I0320 10:58:23.270180 4748 generic.go:334] "Generic (PLEG): container finished" podID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" containerID="df4ecb4636ea9e29b83162f356f15c0a5c472e8b049de36d07077b5bb611b6dc" exitCode=0 Mar 20 10:58:23 crc kubenswrapper[4748]: I0320 10:58:23.270207 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f4ca5c7-6a70-485e-b39d-e786ad0004a5","Type":"ContainerDied","Data":"df4ecb4636ea9e29b83162f356f15c0a5c472e8b049de36d07077b5bb611b6dc"} Mar 20 10:58:23 crc 
kubenswrapper[4748]: I0320 10:58:23.889214 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.022092 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-combined-ca-bundle\") pod \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.023402 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-config-data\") pod \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.023433 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-scripts\") pod \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.023454 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-logs\") pod \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.023490 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwfx9\" (UniqueName: \"kubernetes.io/projected/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-kube-api-access-pwfx9\") pod \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.023541 4748 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-internal-tls-certs\") pod \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.023556 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.023601 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-httpd-run\") pod \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\" (UID: \"0f4ca5c7-6a70-485e-b39d-e786ad0004a5\") " Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.024237 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0f4ca5c7-6a70-485e-b39d-e786ad0004a5" (UID: "0f4ca5c7-6a70-485e-b39d-e786ad0004a5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.032440 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "0f4ca5c7-6a70-485e-b39d-e786ad0004a5" (UID: "0f4ca5c7-6a70-485e-b39d-e786ad0004a5"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.032937 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-logs" (OuterVolumeSpecName: "logs") pod "0f4ca5c7-6a70-485e-b39d-e786ad0004a5" (UID: "0f4ca5c7-6a70-485e-b39d-e786ad0004a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.036047 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-kube-api-access-pwfx9" (OuterVolumeSpecName: "kube-api-access-pwfx9") pod "0f4ca5c7-6a70-485e-b39d-e786ad0004a5" (UID: "0f4ca5c7-6a70-485e-b39d-e786ad0004a5"). InnerVolumeSpecName "kube-api-access-pwfx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.036726 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-scripts" (OuterVolumeSpecName: "scripts") pod "0f4ca5c7-6a70-485e-b39d-e786ad0004a5" (UID: "0f4ca5c7-6a70-485e-b39d-e786ad0004a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.065443 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f4ca5c7-6a70-485e-b39d-e786ad0004a5" (UID: "0f4ca5c7-6a70-485e-b39d-e786ad0004a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.098888 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0f4ca5c7-6a70-485e-b39d-e786ad0004a5" (UID: "0f4ca5c7-6a70-485e-b39d-e786ad0004a5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.125308 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.125392 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.125407 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwfx9\" (UniqueName: \"kubernetes.io/projected/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-kube-api-access-pwfx9\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.125421 4748 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.125472 4748 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.125486 4748 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.125497 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.130846 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-config-data" (OuterVolumeSpecName: "config-data") pod "0f4ca5c7-6a70-485e-b39d-e786ad0004a5" (UID: "0f4ca5c7-6a70-485e-b39d-e786ad0004a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.158012 4748 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.227186 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4ca5c7-6a70-485e-b39d-e786ad0004a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.227234 4748 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.282071 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f4ca5c7-6a70-485e-b39d-e786ad0004a5","Type":"ContainerDied","Data":"f3bf85e569ec584f9e22583b8e0f41f19087cd0ff785ae06e648d27836965069"} Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.282152 4748 scope.go:117] "RemoveContainer" 
containerID="df4ecb4636ea9e29b83162f356f15c0a5c472e8b049de36d07077b5bb611b6dc" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.282282 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.315623 4748 scope.go:117] "RemoveContainer" containerID="12a28e7df4ad27261901712b424f1439a97b18d7aecab5a7b19754232c5975d5" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.329304 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.335751 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.367231 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:58:24 crc kubenswrapper[4748]: E0320 10:58:24.368262 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" containerName="glance-httpd" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.368282 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" containerName="glance-httpd" Mar 20 10:58:24 crc kubenswrapper[4748]: E0320 10:58:24.368324 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" containerName="glance-log" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.368333 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" containerName="glance-log" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.368817 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" containerName="glance-log" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.369865 4748 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" containerName="glance-httpd" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.372229 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.376734 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.377030 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.413358 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.535949 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.536030 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.536162 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f70c4bdb-308f-485c-9f2b-388e135bdfc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.536225 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.536285 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f70c4bdb-308f-485c-9f2b-388e135bdfc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.536304 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.536359 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.536436 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qknfs\" (UniqueName: \"kubernetes.io/projected/f70c4bdb-308f-485c-9f2b-388e135bdfc9-kube-api-access-qknfs\") pod 
\"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.637552 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f70c4bdb-308f-485c-9f2b-388e135bdfc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.637590 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.637637 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.637720 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qknfs\" (UniqueName: \"kubernetes.io/projected/f70c4bdb-308f-485c-9f2b-388e135bdfc9-kube-api-access-qknfs\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.637801 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.637823 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.637863 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f70c4bdb-308f-485c-9f2b-388e135bdfc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.637879 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.639584 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.639649 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f70c4bdb-308f-485c-9f2b-388e135bdfc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 
10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.640291 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f70c4bdb-308f-485c-9f2b-388e135bdfc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.645327 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.647042 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.647918 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.648862 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f70c4bdb-308f-485c-9f2b-388e135bdfc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.664359 4748 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qknfs\" (UniqueName: \"kubernetes.io/projected/f70c4bdb-308f-485c-9f2b-388e135bdfc9-kube-api-access-qknfs\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.671632 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f70c4bdb-308f-485c-9f2b-388e135bdfc9\") " pod="openstack/glance-default-internal-api-0" Mar 20 10:58:24 crc kubenswrapper[4748]: I0320 10:58:24.741178 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:25 crc kubenswrapper[4748]: I0320 10:58:25.323484 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ffe44fec-9121-46b2-9087-eba59b656915","Type":"ContainerStarted","Data":"efa46b523c646969bb76ad7bb0ebb37e999b428dde234c8b95b4b2ddab5489f9"} Mar 20 10:58:25 crc kubenswrapper[4748]: I0320 10:58:25.349660 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.349633743 podStartE2EDuration="4.349633743s" podCreationTimestamp="2026-03-20 10:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.341908019 +0000 UTC m=+1340.483453843" watchObservedRunningTime="2026-03-20 10:58:25.349633743 +0000 UTC m=+1340.491179557" Mar 20 10:58:25 crc kubenswrapper[4748]: I0320 10:58:25.378278 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 10:58:25 crc kubenswrapper[4748]: W0320 10:58:25.382663 4748 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf70c4bdb_308f_485c_9f2b_388e135bdfc9.slice/crio-2a0e51b2e12fc185f18814c861c75b586d2d812bb77893412077e2f4f282dc53 WatchSource:0}: Error finding container 2a0e51b2e12fc185f18814c861c75b586d2d812bb77893412077e2f4f282dc53: Status 404 returned error can't find the container with id 2a0e51b2e12fc185f18814c861c75b586d2d812bb77893412077e2f4f282dc53 Mar 20 10:58:25 crc kubenswrapper[4748]: I0320 10:58:25.536232 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f4ca5c7-6a70-485e-b39d-e786ad0004a5" path="/var/lib/kubelet/pods/0f4ca5c7-6a70-485e-b39d-e786ad0004a5/volumes" Mar 20 10:58:26 crc kubenswrapper[4748]: I0320 10:58:26.337565 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f70c4bdb-308f-485c-9f2b-388e135bdfc9","Type":"ContainerStarted","Data":"caa244dee2d8426619fc8c1c0bd263a14d9ef995c4af2b3d79ca596ad453362a"} Mar 20 10:58:26 crc kubenswrapper[4748]: I0320 10:58:26.338712 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f70c4bdb-308f-485c-9f2b-388e135bdfc9","Type":"ContainerStarted","Data":"2a0e51b2e12fc185f18814c861c75b586d2d812bb77893412077e2f4f282dc53"} Mar 20 10:58:27 crc kubenswrapper[4748]: I0320 10:58:27.368676 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f70c4bdb-308f-485c-9f2b-388e135bdfc9","Type":"ContainerStarted","Data":"7b6eef07df2d748e86402a627b995069bf2d3bad57c03b439a9dbca2f77eaeb4"} Mar 20 10:58:27 crc kubenswrapper[4748]: I0320 10:58:27.395453 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.395427362 podStartE2EDuration="3.395427362s" podCreationTimestamp="2026-03-20 10:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:27.393225267 +0000 UTC m=+1342.534771091" watchObservedRunningTime="2026-03-20 10:58:27.395427362 +0000 UTC m=+1342.536973176" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.400379 4748 generic.go:334] "Generic (PLEG): container finished" podID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerID="7df4a7959b03db15ccbc3d1cd86e1cd6e8de9867705181272a1e3ae10b2473dc" exitCode=0 Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.400536 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"246fb9c9-9e51-4acc-b93b-a630b9d65d6d","Type":"ContainerDied","Data":"7df4a7959b03db15ccbc3d1cd86e1cd6e8de9867705181272a1e3ae10b2473dc"} Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.402206 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"246fb9c9-9e51-4acc-b93b-a630b9d65d6d","Type":"ContainerDied","Data":"cfe3990da9af2edf9b0906f89dd103a3625ff74779238f50081f8fd8c6871de1"} Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.402338 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfe3990da9af2edf9b0906f89dd103a3625ff74779238f50081f8fd8c6871de1" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.465140 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.552777 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-log-httpd\") pod \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.552928 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-combined-ca-bundle\") pod \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.552985 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-sg-core-conf-yaml\") pod \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.553056 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-scripts\") pod \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.553085 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-config-data\") pod \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.553191 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-run-httpd\") pod \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.553250 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8c6w\" (UniqueName: \"kubernetes.io/projected/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-kube-api-access-c8c6w\") pod \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\" (UID: \"246fb9c9-9e51-4acc-b93b-a630b9d65d6d\") " Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.554149 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "246fb9c9-9e51-4acc-b93b-a630b9d65d6d" (UID: "246fb9c9-9e51-4acc-b93b-a630b9d65d6d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.554998 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "246fb9c9-9e51-4acc-b93b-a630b9d65d6d" (UID: "246fb9c9-9e51-4acc-b93b-a630b9d65d6d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.566097 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-kube-api-access-c8c6w" (OuterVolumeSpecName: "kube-api-access-c8c6w") pod "246fb9c9-9e51-4acc-b93b-a630b9d65d6d" (UID: "246fb9c9-9e51-4acc-b93b-a630b9d65d6d"). InnerVolumeSpecName "kube-api-access-c8c6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.589775 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-scripts" (OuterVolumeSpecName: "scripts") pod "246fb9c9-9e51-4acc-b93b-a630b9d65d6d" (UID: "246fb9c9-9e51-4acc-b93b-a630b9d65d6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.593609 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "246fb9c9-9e51-4acc-b93b-a630b9d65d6d" (UID: "246fb9c9-9e51-4acc-b93b-a630b9d65d6d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.651972 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "246fb9c9-9e51-4acc-b93b-a630b9d65d6d" (UID: "246fb9c9-9e51-4acc-b93b-a630b9d65d6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.655374 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8c6w\" (UniqueName: \"kubernetes.io/projected/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-kube-api-access-c8c6w\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.655414 4748 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.655431 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.655446 4748 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.655459 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.655471 4748 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.678171 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-config-data" (OuterVolumeSpecName: "config-data") pod "246fb9c9-9e51-4acc-b93b-a630b9d65d6d" (UID: "246fb9c9-9e51-4acc-b93b-a630b9d65d6d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:30 crc kubenswrapper[4748]: I0320 10:58:30.757031 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246fb9c9-9e51-4acc-b93b-a630b9d65d6d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.410539 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.448253 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.459995 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.469161 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:31 crc kubenswrapper[4748]: E0320 10:58:31.469633 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="ceilometer-notification-agent" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.469664 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="ceilometer-notification-agent" Mar 20 10:58:31 crc kubenswrapper[4748]: E0320 10:58:31.469701 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="ceilometer-central-agent" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.469710 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="ceilometer-central-agent" Mar 20 10:58:31 crc kubenswrapper[4748]: E0320 10:58:31.469724 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" 
containerName="proxy-httpd" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.469731 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="proxy-httpd" Mar 20 10:58:31 crc kubenswrapper[4748]: E0320 10:58:31.469750 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="sg-core" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.469758 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="sg-core" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.469966 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="sg-core" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.469988 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="proxy-httpd" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.469999 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="ceilometer-notification-agent" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.470013 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" containerName="ceilometer-central-agent" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.482699 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.486184 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.487395 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.496188 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.551693 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="246fb9c9-9e51-4acc-b93b-a630b9d65d6d" path="/var/lib/kubelet/pods/246fb9c9-9e51-4acc-b93b-a630b9d65d6d/volumes" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.572247 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-log-httpd\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.572349 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.572458 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rddnp\" (UniqueName: \"kubernetes.io/projected/0c9d3443-1243-4415-a0fe-747695b73aa4-kube-api-access-rddnp\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.573334 4748 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.573435 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-scripts\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.573466 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-config-data\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.573499 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-run-httpd\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.674994 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rddnp\" (UniqueName: \"kubernetes.io/projected/0c9d3443-1243-4415-a0fe-747695b73aa4-kube-api-access-rddnp\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.675054 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.675098 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-scripts\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.675114 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-config-data\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.675140 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-run-httpd\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.675232 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-log-httpd\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.675280 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.676079 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-log-httpd\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.676271 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-run-httpd\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.680748 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-scripts\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.680883 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.688762 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.689146 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-config-data\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 
10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.695684 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rddnp\" (UniqueName: \"kubernetes.io/projected/0c9d3443-1243-4415-a0fe-747695b73aa4-kube-api-access-rddnp\") pod \"ceilometer-0\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " pod="openstack/ceilometer-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.707419 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.709302 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.754645 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.755156 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 10:58:31 crc kubenswrapper[4748]: I0320 10:58:31.815587 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:58:32 crc kubenswrapper[4748]: I0320 10:58:32.317351 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:58:32 crc kubenswrapper[4748]: I0320 10:58:32.420749 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c9d3443-1243-4415-a0fe-747695b73aa4","Type":"ContainerStarted","Data":"895665576deb408245e06e0e474d1d41dd11e139772f6d93ea44e63822923789"} Mar 20 10:58:32 crc kubenswrapper[4748]: I0320 10:58:32.421023 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 10:58:32 crc kubenswrapper[4748]: I0320 10:58:32.421048 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 10:58:33 crc kubenswrapper[4748]: I0320 10:58:33.431189 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c9d3443-1243-4415-a0fe-747695b73aa4","Type":"ContainerStarted","Data":"160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab"} Mar 20 10:58:34 crc kubenswrapper[4748]: I0320 10:58:34.505953 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 10:58:34 crc kubenswrapper[4748]: I0320 10:58:34.506459 4748 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 10:58:34 crc kubenswrapper[4748]: I0320 10:58:34.562514 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 10:58:34 crc kubenswrapper[4748]: I0320 10:58:34.742730 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:34 crc kubenswrapper[4748]: I0320 10:58:34.744657 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Mar 20 10:58:34 crc kubenswrapper[4748]: I0320 10:58:34.813017 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:34 crc kubenswrapper[4748]: I0320 10:58:34.814377 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:35 crc kubenswrapper[4748]: I0320 10:58:35.451195 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c9d3443-1243-4415-a0fe-747695b73aa4","Type":"ContainerStarted","Data":"7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499"} Mar 20 10:58:35 crc kubenswrapper[4748]: I0320 10:58:35.451502 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:35 crc kubenswrapper[4748]: I0320 10:58:35.451941 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:36 crc kubenswrapper[4748]: I0320 10:58:36.464759 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c9d3443-1243-4415-a0fe-747695b73aa4","Type":"ContainerStarted","Data":"fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a"} Mar 20 10:58:37 crc kubenswrapper[4748]: I0320 10:58:37.473422 4748 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 10:58:37 crc kubenswrapper[4748]: I0320 10:58:37.474743 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:37 crc kubenswrapper[4748]: I0320 10:58:37.475280 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 10:58:38 crc kubenswrapper[4748]: I0320 10:58:38.484772 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"0c9d3443-1243-4415-a0fe-747695b73aa4","Type":"ContainerStarted","Data":"cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf"} Mar 20 10:58:38 crc kubenswrapper[4748]: I0320 10:58:38.485423 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 10:58:38 crc kubenswrapper[4748]: I0320 10:58:38.513129 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.684973674 podStartE2EDuration="7.513107901s" podCreationTimestamp="2026-03-20 10:58:31 +0000 UTC" firstStartedPulling="2026-03-20 10:58:32.328328324 +0000 UTC m=+1347.469874138" lastFinishedPulling="2026-03-20 10:58:38.156462551 +0000 UTC m=+1353.298008365" observedRunningTime="2026-03-20 10:58:38.505565552 +0000 UTC m=+1353.647111396" watchObservedRunningTime="2026-03-20 10:58:38.513107901 +0000 UTC m=+1353.654653715" Mar 20 10:58:42 crc kubenswrapper[4748]: I0320 10:58:42.929052 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:58:42 crc kubenswrapper[4748]: I0320 10:58:42.930917 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:58:42 crc kubenswrapper[4748]: I0320 10:58:42.931044 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 10:58:42 crc kubenswrapper[4748]: I0320 10:58:42.931848 4748 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5702c4811be941554197075836f07c222a1d80d3d9c15e6ccc7b992ee69ce82c"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:58:42 crc kubenswrapper[4748]: I0320 10:58:42.931991 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://5702c4811be941554197075836f07c222a1d80d3d9c15e6ccc7b992ee69ce82c" gracePeriod=600 Mar 20 10:58:43 crc kubenswrapper[4748]: I0320 10:58:43.525178 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="5702c4811be941554197075836f07c222a1d80d3d9c15e6ccc7b992ee69ce82c" exitCode=0 Mar 20 10:58:43 crc kubenswrapper[4748]: I0320 10:58:43.526062 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"5702c4811be941554197075836f07c222a1d80d3d9c15e6ccc7b992ee69ce82c"} Mar 20 10:58:43 crc kubenswrapper[4748]: I0320 10:58:43.526104 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"ec8318a59f2a0dcbdd19ff1535aafa2664120c1ab98ee7f0ce82eda8b7b3e371"} Mar 20 10:58:43 crc kubenswrapper[4748]: I0320 10:58:43.526123 4748 scope.go:117] "RemoveContainer" containerID="c1253b5c73f9d1fbdb9567d5ed33468b56df43e62dd82201c361a89f824870ce" Mar 20 10:58:49 crc kubenswrapper[4748]: I0320 10:58:49.584754 4748 generic.go:334] "Generic (PLEG): container finished" 
podID="a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9" containerID="5a49b5d8f18000cabf840c3c9026fdb78f1a5f83b9c6cce775ffeaa423653f08" exitCode=0 Mar 20 10:58:49 crc kubenswrapper[4748]: I0320 10:58:49.585000 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bs7cs" event={"ID":"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9","Type":"ContainerDied","Data":"5a49b5d8f18000cabf840c3c9026fdb78f1a5f83b9c6cce775ffeaa423653f08"} Mar 20 10:58:50 crc kubenswrapper[4748]: I0320 10:58:50.867386 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.024814 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-scripts\") pod \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.024890 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-config-data\") pod \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.024965 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-combined-ca-bundle\") pod \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.025010 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd92x\" (UniqueName: \"kubernetes.io/projected/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-kube-api-access-fd92x\") pod 
\"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\" (UID: \"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9\") " Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.030648 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-scripts" (OuterVolumeSpecName: "scripts") pod "a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9" (UID: "a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.031364 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-kube-api-access-fd92x" (OuterVolumeSpecName: "kube-api-access-fd92x") pod "a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9" (UID: "a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9"). InnerVolumeSpecName "kube-api-access-fd92x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.054575 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9" (UID: "a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.056202 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-config-data" (OuterVolumeSpecName: "config-data") pod "a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9" (UID: "a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.127120 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.127154 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.127165 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.127178 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd92x\" (UniqueName: \"kubernetes.io/projected/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9-kube-api-access-fd92x\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.604392 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bs7cs" event={"ID":"a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9","Type":"ContainerDied","Data":"569e1001e44de1d02ce2847704b18bce5d56d8ccc7e31fd68352f1d72304f1b6"} Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.604440 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="569e1001e44de1d02ce2847704b18bce5d56d8ccc7e31fd68352f1d72304f1b6" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.604500 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bs7cs" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.707825 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 10:58:51 crc kubenswrapper[4748]: E0320 10:58:51.708616 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9" containerName="nova-cell0-conductor-db-sync" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.708640 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9" containerName="nova-cell0-conductor-db-sync" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.708904 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9" containerName="nova-cell0-conductor-db-sync" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.709753 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.712721 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m7mzz" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.712903 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.721792 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.841654 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b67252d-9978-4214-bf76-b57b2272c603-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8b67252d-9978-4214-bf76-b57b2272c603\") " pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:51 crc kubenswrapper[4748]: 
I0320 10:58:51.842062 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7phf\" (UniqueName: \"kubernetes.io/projected/8b67252d-9978-4214-bf76-b57b2272c603-kube-api-access-d7phf\") pod \"nova-cell0-conductor-0\" (UID: \"8b67252d-9978-4214-bf76-b57b2272c603\") " pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.842201 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b67252d-9978-4214-bf76-b57b2272c603-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8b67252d-9978-4214-bf76-b57b2272c603\") " pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.943916 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b67252d-9978-4214-bf76-b57b2272c603-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8b67252d-9978-4214-bf76-b57b2272c603\") " pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.944024 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7phf\" (UniqueName: \"kubernetes.io/projected/8b67252d-9978-4214-bf76-b57b2272c603-kube-api-access-d7phf\") pod \"nova-cell0-conductor-0\" (UID: \"8b67252d-9978-4214-bf76-b57b2272c603\") " pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.944057 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b67252d-9978-4214-bf76-b57b2272c603-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8b67252d-9978-4214-bf76-b57b2272c603\") " pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.948274 4748 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b67252d-9978-4214-bf76-b57b2272c603-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8b67252d-9978-4214-bf76-b57b2272c603\") " pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.954283 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b67252d-9978-4214-bf76-b57b2272c603-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8b67252d-9978-4214-bf76-b57b2272c603\") " pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:51 crc kubenswrapper[4748]: I0320 10:58:51.968916 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7phf\" (UniqueName: \"kubernetes.io/projected/8b67252d-9978-4214-bf76-b57b2272c603-kube-api-access-d7phf\") pod \"nova-cell0-conductor-0\" (UID: \"8b67252d-9978-4214-bf76-b57b2272c603\") " pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:52 crc kubenswrapper[4748]: I0320 10:58:52.025886 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:52 crc kubenswrapper[4748]: I0320 10:58:52.472078 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 10:58:52 crc kubenswrapper[4748]: I0320 10:58:52.614183 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8b67252d-9978-4214-bf76-b57b2272c603","Type":"ContainerStarted","Data":"87649af5c0b5ec60f7cd3354cf87cea84640fcde9c4c0c14345d243929583b41"} Mar 20 10:58:53 crc kubenswrapper[4748]: I0320 10:58:53.626783 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8b67252d-9978-4214-bf76-b57b2272c603","Type":"ContainerStarted","Data":"922921d6ee98ef04e61dacba9a91a9f192752186f811532de9cab8c6bb343aa2"} Mar 20 10:58:53 crc kubenswrapper[4748]: I0320 10:58:53.627366 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:53 crc kubenswrapper[4748]: I0320 10:58:53.649209 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.649186243 podStartE2EDuration="2.649186243s" podCreationTimestamp="2026-03-20 10:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:53.644829724 +0000 UTC m=+1368.786375558" watchObservedRunningTime="2026-03-20 10:58:53.649186243 +0000 UTC m=+1368.790732077" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.058859 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.467454 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-hww8p"] Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.468582 4748 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.470858 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.471393 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.480561 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hww8p"] Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.516799 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.516940 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-config-data\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.517007 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mcb\" (UniqueName: \"kubernetes.io/projected/d7263772-e7ec-43ad-815f-7c6a67575402-kube-api-access-p7mcb\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.517026 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-scripts\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.621633 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.623891 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-config-data\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.624155 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7mcb\" (UniqueName: \"kubernetes.io/projected/d7263772-e7ec-43ad-815f-7c6a67575402-kube-api-access-p7mcb\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.624179 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-scripts\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.642123 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.651229 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-scripts\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.652380 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-config-data\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.652811 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.658459 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.663494 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7mcb\" (UniqueName: \"kubernetes.io/projected/d7263772-e7ec-43ad-815f-7c6a67575402-kube-api-access-p7mcb\") pod \"nova-cell0-cell-mapping-hww8p\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.667258 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.681446 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.725929 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnnmd\" (UniqueName: \"kubernetes.io/projected/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-kube-api-access-gnnmd\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.726052 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.726091 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-config-data\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.726120 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-logs\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.760921 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.767522 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.774281 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.791188 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.824048 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.826801 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-config-data\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.826871 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-logs\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.826938 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-config-data\") pod \"nova-scheduler-0\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " pod="openstack/nova-scheduler-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.827000 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnnmd\" (UniqueName: \"kubernetes.io/projected/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-kube-api-access-gnnmd\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.827084 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqw4v\" (UniqueName: \"kubernetes.io/projected/a3539da7-c3da-4c63-9eec-56bf69254d9f-kube-api-access-zqw4v\") pod \"nova-scheduler-0\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " pod="openstack/nova-scheduler-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.827108 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " pod="openstack/nova-scheduler-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.827143 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.828440 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-logs\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " 
pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.842622 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-config-data\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.861805 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.871616 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnnmd\" (UniqueName: \"kubernetes.io/projected/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-kube-api-access-gnnmd\") pod \"nova-api-0\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " pod="openstack/nova-api-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.886029 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.888484 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.911635 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.913039 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.928757 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-config-data\") pod \"nova-scheduler-0\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " pod="openstack/nova-scheduler-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.929113 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " pod="openstack/nova-scheduler-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.929717 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqw4v\" (UniqueName: \"kubernetes.io/projected/a3539da7-c3da-4c63-9eec-56bf69254d9f-kube-api-access-zqw4v\") pod \"nova-scheduler-0\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " pod="openstack/nova-scheduler-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.937480 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.939125 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.946335 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.953643 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.957216 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-config-data\") pod \"nova-scheduler-0\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " pod="openstack/nova-scheduler-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.968962 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqw4v\" (UniqueName: \"kubernetes.io/projected/a3539da7-c3da-4c63-9eec-56bf69254d9f-kube-api-access-zqw4v\") pod \"nova-scheduler-0\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " pod="openstack/nova-scheduler-0" Mar 20 10:58:57 crc kubenswrapper[4748]: I0320 10:58:57.969618 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " pod="openstack/nova-scheduler-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.002892 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9c85k"] Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.004386 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.035647 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9c85k"] Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.044806 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.044911 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhh87\" (UniqueName: \"kubernetes.io/projected/b9da5564-8bfd-4c19-acba-4b382519195a-kube-api-access-dhh87\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.044938 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zq79\" (UniqueName: \"kubernetes.io/projected/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-kube-api-access-8zq79\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.044998 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-svc\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.045040 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-config\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.045093 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.045128 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.045150 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.045201 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.045240 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-logs\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.045272 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.045418 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-config-data\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.045456 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq7xk\" (UniqueName: \"kubernetes.io/projected/46817d78-848a-4ab6-8de1-2e2b03381000-kube-api-access-gq7xk\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.067761 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.103013 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.148559 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-config-data\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149114 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq7xk\" (UniqueName: \"kubernetes.io/projected/46817d78-848a-4ab6-8de1-2e2b03381000-kube-api-access-gq7xk\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149155 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149203 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhh87\" (UniqueName: \"kubernetes.io/projected/b9da5564-8bfd-4c19-acba-4b382519195a-kube-api-access-dhh87\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149222 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zq79\" (UniqueName: \"kubernetes.io/projected/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-kube-api-access-8zq79\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc 
kubenswrapper[4748]: I0320 10:58:58.149270 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-svc\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149300 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-config\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149340 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149785 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149804 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149859 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149888 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-logs\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.149909 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.151123 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.152753 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-config\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.153773 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.154850 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-svc\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.155345 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-config-data\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.155790 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-logs\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.155987 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.160338 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " 
pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.162114 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.172264 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.177939 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq7xk\" (UniqueName: \"kubernetes.io/projected/46817d78-848a-4ab6-8de1-2e2b03381000-kube-api-access-gq7xk\") pod \"dnsmasq-dns-757b4f8459-9c85k\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.179406 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zq79\" (UniqueName: \"kubernetes.io/projected/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-kube-api-access-8zq79\") pod \"nova-metadata-0\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.183558 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhh87\" (UniqueName: \"kubernetes.io/projected/b9da5564-8bfd-4c19-acba-4b382519195a-kube-api-access-dhh87\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.364519 4748 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.393669 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.419752 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.474326 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hww8p"] Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.645725 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.707497 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hww8p" event={"ID":"d7263772-e7ec-43ad-815f-7c6a67575402","Type":"ContainerStarted","Data":"90083acdc38d59c542973c9bfec48fd2c41804d4fe37ab0108a61291548240dd"} Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.710435 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f45b2b72-df5d-4ae6-9f5a-8f6cced75642","Type":"ContainerStarted","Data":"d0b842ac82033584a141103c86b10919396c0bcaaed1b0b4f0466192b92eb967"} Mar 20 10:58:58 crc kubenswrapper[4748]: W0320 10:58:58.752092 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3539da7_c3da_4c63_9eec_56bf69254d9f.slice/crio-5e5a5de8544b037774fafaa7b774b6011ddae98c5647d3c8a2783f9e11d96560 WatchSource:0}: Error finding container 5e5a5de8544b037774fafaa7b774b6011ddae98c5647d3c8a2783f9e11d96560: Status 404 returned error can't find the container with id 5e5a5de8544b037774fafaa7b774b6011ddae98c5647d3c8a2783f9e11d96560 Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.759141 4748 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.872260 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rghcf"] Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.873536 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.877551 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.877664 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.892807 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rghcf"] Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.966762 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-scripts\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.966881 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.966946 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcmbv\" (UniqueName: 
\"kubernetes.io/projected/292d1168-7edc-4e05-a657-c03029450a6b-kube-api-access-xcmbv\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:58 crc kubenswrapper[4748]: I0320 10:58:58.967137 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-config-data\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.007796 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.069011 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-config-data\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.069405 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-scripts\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.069501 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:59 crc 
kubenswrapper[4748]: I0320 10:58:59.069655 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcmbv\" (UniqueName: \"kubernetes.io/projected/292d1168-7edc-4e05-a657-c03029450a6b-kube-api-access-xcmbv\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.076115 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.076139 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-config-data\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.082024 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-scripts\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.090392 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcmbv\" (UniqueName: \"kubernetes.io/projected/292d1168-7edc-4e05-a657-c03029450a6b-kube-api-access-xcmbv\") pod \"nova-cell1-conductor-db-sync-rghcf\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:59 crc 
kubenswrapper[4748]: I0320 10:58:59.127904 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9c85k"] Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.145184 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.204384 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.754323 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9da5564-8bfd-4c19-acba-4b382519195a","Type":"ContainerStarted","Data":"45d93bab766ffb83b62010f9f0ba1e6f9649e2e23bd740094589ffd3990f860d"} Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.761318 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a3539da7-c3da-4c63-9eec-56bf69254d9f","Type":"ContainerStarted","Data":"5e5a5de8544b037774fafaa7b774b6011ddae98c5647d3c8a2783f9e11d96560"} Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.766669 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rghcf"] Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.773332 4748 generic.go:334] "Generic (PLEG): container finished" podID="46817d78-848a-4ab6-8de1-2e2b03381000" containerID="75303485c9a0643eebc9b8259b42027729c61be3251c5259866868d260dda6db" exitCode=0 Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.773499 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" event={"ID":"46817d78-848a-4ab6-8de1-2e2b03381000","Type":"ContainerDied","Data":"75303485c9a0643eebc9b8259b42027729c61be3251c5259866868d260dda6db"} Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.773536 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-757b4f8459-9c85k" event={"ID":"46817d78-848a-4ab6-8de1-2e2b03381000","Type":"ContainerStarted","Data":"97b691cd7d02b4968233ba62e4838ab17b2af99371ea1276f7c3526d7dcd93df"} Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.783369 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hww8p" event={"ID":"d7263772-e7ec-43ad-815f-7c6a67575402","Type":"ContainerStarted","Data":"6ffd362a35588b891b195eb4be044481eb6c22e82158760f46e425d88d3a61b6"} Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.788694 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03bcc896-7c4f-4db1-97a4-c8b48821bd1d","Type":"ContainerStarted","Data":"c98d63c4f48d163fce90e46451b76470d18dd7d0850858ae57e62559039cf0dd"} Mar 20 10:58:59 crc kubenswrapper[4748]: I0320 10:58:59.841075 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-hww8p" podStartSLOduration=2.84104328 podStartE2EDuration="2.84104328s" podCreationTimestamp="2026-03-20 10:58:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:59.838510447 +0000 UTC m=+1374.980056261" watchObservedRunningTime="2026-03-20 10:58:59.84104328 +0000 UTC m=+1374.982589094" Mar 20 10:59:00 crc kubenswrapper[4748]: I0320 10:59:00.845158 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" event={"ID":"46817d78-848a-4ab6-8de1-2e2b03381000","Type":"ContainerStarted","Data":"7c379ff449728ba81f9de6130ce47080f5d3d4a8fac849e17e9e6c3e04c6b10e"} Mar 20 10:59:00 crc kubenswrapper[4748]: I0320 10:59:00.845921 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:59:00 crc kubenswrapper[4748]: I0320 10:59:00.862594 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-rghcf" event={"ID":"292d1168-7edc-4e05-a657-c03029450a6b","Type":"ContainerStarted","Data":"d1ebbb3667bd4d483bc0e2f4be4610d8a04be8354aac381ebc50df5e6bad9595"} Mar 20 10:59:00 crc kubenswrapper[4748]: I0320 10:59:00.862653 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rghcf" event={"ID":"292d1168-7edc-4e05-a657-c03029450a6b","Type":"ContainerStarted","Data":"1f6fdf596716024868d4249232ddfe589fb6b2692b2f13c40fd1e84a4c60a9c7"} Mar 20 10:59:00 crc kubenswrapper[4748]: I0320 10:59:00.888778 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" podStartSLOduration=3.888748712 podStartE2EDuration="3.888748712s" podCreationTimestamp="2026-03-20 10:58:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:00.876423684 +0000 UTC m=+1376.017969518" watchObservedRunningTime="2026-03-20 10:59:00.888748712 +0000 UTC m=+1376.030294526" Mar 20 10:59:00 crc kubenswrapper[4748]: I0320 10:59:00.902641 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-rghcf" podStartSLOduration=2.902618429 podStartE2EDuration="2.902618429s" podCreationTimestamp="2026-03-20 10:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:00.893785298 +0000 UTC m=+1376.035331102" watchObservedRunningTime="2026-03-20 10:59:00.902618429 +0000 UTC m=+1376.044164243" Mar 20 10:59:01 crc kubenswrapper[4748]: I0320 10:59:01.607734 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 10:59:01 crc kubenswrapper[4748]: I0320 10:59:01.620622 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:02 crc 
kubenswrapper[4748]: I0320 10:59:02.040723 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 10:59:03 crc kubenswrapper[4748]: I0320 10:59:03.895102 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f45b2b72-df5d-4ae6-9f5a-8f6cced75642","Type":"ContainerStarted","Data":"ddcac3d621e07e876e9eb9fe3a0fa0559595aa5e04dc6c8838cf569b7effb9cf"} Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.908135 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9da5564-8bfd-4c19-acba-4b382519195a","Type":"ContainerStarted","Data":"0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18"} Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.908262 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b9da5564-8bfd-4c19-acba-4b382519195a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18" gracePeriod=30 Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.910457 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03bcc896-7c4f-4db1-97a4-c8b48821bd1d","Type":"ContainerStarted","Data":"9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895"} Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.910505 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03bcc896-7c4f-4db1-97a4-c8b48821bd1d","Type":"ContainerStarted","Data":"b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b"} Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.910563 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" containerName="nova-metadata-log" 
containerID="cri-o://b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b" gracePeriod=30 Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.910601 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" containerName="nova-metadata-metadata" containerID="cri-o://9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895" gracePeriod=30 Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.912545 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f45b2b72-df5d-4ae6-9f5a-8f6cced75642","Type":"ContainerStarted","Data":"f92e189958f888ce283b036bc284b4431af82bd0c2f0bd94e1a43b803ebb2fc2"} Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.914711 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a3539da7-c3da-4c63-9eec-56bf69254d9f","Type":"ContainerStarted","Data":"8924d11d2e8e9ae5dc583afbb64c79638c1923c1eb98201390ee1811f9ad2fa6"} Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.932618 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.585112183 podStartE2EDuration="7.932599923s" podCreationTimestamp="2026-03-20 10:58:57 +0000 UTC" firstStartedPulling="2026-03-20 10:58:59.142815817 +0000 UTC m=+1374.284361631" lastFinishedPulling="2026-03-20 10:59:03.490303557 +0000 UTC m=+1378.631849371" observedRunningTime="2026-03-20 10:59:04.925020003 +0000 UTC m=+1380.066565817" watchObservedRunningTime="2026-03-20 10:59:04.932599923 +0000 UTC m=+1380.074145737" Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.957376 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.157360509 podStartE2EDuration="7.957352792s" podCreationTimestamp="2026-03-20 10:58:57 +0000 UTC" firstStartedPulling="2026-03-20 
10:58:58.686426137 +0000 UTC m=+1373.827971961" lastFinishedPulling="2026-03-20 10:59:03.48641844 +0000 UTC m=+1378.627964244" observedRunningTime="2026-03-20 10:59:04.950545882 +0000 UTC m=+1380.092091696" watchObservedRunningTime="2026-03-20 10:59:04.957352792 +0000 UTC m=+1380.098898606" Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.977023 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.249071572 podStartE2EDuration="7.977001283s" podCreationTimestamp="2026-03-20 10:58:57 +0000 UTC" firstStartedPulling="2026-03-20 10:58:58.758083299 +0000 UTC m=+1373.899629113" lastFinishedPulling="2026-03-20 10:59:03.48601301 +0000 UTC m=+1378.627558824" observedRunningTime="2026-03-20 10:59:04.967395983 +0000 UTC m=+1380.108941817" watchObservedRunningTime="2026-03-20 10:59:04.977001283 +0000 UTC m=+1380.118547097" Mar 20 10:59:04 crc kubenswrapper[4748]: I0320 10:59:04.993474 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.521968234 podStartE2EDuration="7.993450244s" podCreationTimestamp="2026-03-20 10:58:57 +0000 UTC" firstStartedPulling="2026-03-20 10:58:59.014777386 +0000 UTC m=+1374.156323200" lastFinishedPulling="2026-03-20 10:59:03.486259396 +0000 UTC m=+1378.627805210" observedRunningTime="2026-03-20 10:59:04.986759437 +0000 UTC m=+1380.128305261" watchObservedRunningTime="2026-03-20 10:59:04.993450244 +0000 UTC m=+1380.134996058" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.564576 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.634611 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-combined-ca-bundle\") pod \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.634859 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zq79\" (UniqueName: \"kubernetes.io/projected/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-kube-api-access-8zq79\") pod \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.634905 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-logs\") pod \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.634951 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-config-data\") pod \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\" (UID: \"03bcc896-7c4f-4db1-97a4-c8b48821bd1d\") " Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.636641 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-logs" (OuterVolumeSpecName: "logs") pod "03bcc896-7c4f-4db1-97a4-c8b48821bd1d" (UID: "03bcc896-7c4f-4db1-97a4-c8b48821bd1d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.657089 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-kube-api-access-8zq79" (OuterVolumeSpecName: "kube-api-access-8zq79") pod "03bcc896-7c4f-4db1-97a4-c8b48821bd1d" (UID: "03bcc896-7c4f-4db1-97a4-c8b48821bd1d"). InnerVolumeSpecName "kube-api-access-8zq79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.668261 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-config-data" (OuterVolumeSpecName: "config-data") pod "03bcc896-7c4f-4db1-97a4-c8b48821bd1d" (UID: "03bcc896-7c4f-4db1-97a4-c8b48821bd1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.668806 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03bcc896-7c4f-4db1-97a4-c8b48821bd1d" (UID: "03bcc896-7c4f-4db1-97a4-c8b48821bd1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.736717 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zq79\" (UniqueName: \"kubernetes.io/projected/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-kube-api-access-8zq79\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.736758 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.736770 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.736784 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03bcc896-7c4f-4db1-97a4-c8b48821bd1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.923564 4748 generic.go:334] "Generic (PLEG): container finished" podID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" containerID="9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895" exitCode=0 Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.923598 4748 generic.go:334] "Generic (PLEG): container finished" podID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" containerID="b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b" exitCode=143 Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.923619 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.923664 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03bcc896-7c4f-4db1-97a4-c8b48821bd1d","Type":"ContainerDied","Data":"9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895"} Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.923691 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03bcc896-7c4f-4db1-97a4-c8b48821bd1d","Type":"ContainerDied","Data":"b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b"} Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.923700 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03bcc896-7c4f-4db1-97a4-c8b48821bd1d","Type":"ContainerDied","Data":"c98d63c4f48d163fce90e46451b76470d18dd7d0850858ae57e62559039cf0dd"} Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.923714 4748 scope.go:117] "RemoveContainer" containerID="9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.956902 4748 scope.go:117] "RemoveContainer" containerID="b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.964130 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.980861 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.984055 4748 scope.go:117] "RemoveContainer" containerID="9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895" Mar 20 10:59:05 crc kubenswrapper[4748]: E0320 10:59:05.988266 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895\": container with ID starting with 9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895 not found: ID does not exist" containerID="9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.988311 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895"} err="failed to get container status \"9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895\": rpc error: code = NotFound desc = could not find container \"9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895\": container with ID starting with 9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895 not found: ID does not exist" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.988337 4748 scope.go:117] "RemoveContainer" containerID="b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b" Mar 20 10:59:05 crc kubenswrapper[4748]: E0320 10:59:05.988822 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b\": container with ID starting with b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b not found: ID does not exist" containerID="b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.988870 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b"} err="failed to get container status \"b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b\": rpc error: code = NotFound desc = could not find container \"b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b\": container with ID 
starting with b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b not found: ID does not exist" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.988887 4748 scope.go:117] "RemoveContainer" containerID="9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.989108 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895"} err="failed to get container status \"9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895\": rpc error: code = NotFound desc = could not find container \"9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895\": container with ID starting with 9e4c7f13fba570acfebd1976c777e0d5cfcca5ac087ce2121d42feed119cf895 not found: ID does not exist" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.989127 4748 scope.go:117] "RemoveContainer" containerID="b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.989425 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b"} err="failed to get container status \"b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b\": rpc error: code = NotFound desc = could not find container \"b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b\": container with ID starting with b2ae65f728186bdbfc5896331a23feba311ba3c767f94842146308204ebb545b not found: ID does not exist" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.989587 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:05 crc kubenswrapper[4748]: E0320 10:59:05.990008 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" 
containerName="nova-metadata-log" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.990026 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" containerName="nova-metadata-log" Mar 20 10:59:05 crc kubenswrapper[4748]: E0320 10:59:05.990061 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" containerName="nova-metadata-metadata" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.990068 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" containerName="nova-metadata-metadata" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.990290 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" containerName="nova-metadata-metadata" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.990311 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" containerName="nova-metadata-log" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.991281 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.995786 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 10:59:05 crc kubenswrapper[4748]: I0320 10:59:05.996050 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.004904 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.047334 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.047438 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.047563 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-config-data\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.047641 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-logs\") pod \"nova-metadata-0\" (UID: 
\"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.047693 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n2vk\" (UniqueName: \"kubernetes.io/projected/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-kube-api-access-8n2vk\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.148824 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-config-data\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.148909 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-logs\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.148939 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n2vk\" (UniqueName: \"kubernetes.io/projected/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-kube-api-access-8n2vk\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.148983 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.149024 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.149498 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-logs\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.156662 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.156855 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.156907 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-config-data\") pod \"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.173921 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n2vk\" (UniqueName: \"kubernetes.io/projected/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-kube-api-access-8n2vk\") pod 
\"nova-metadata-0\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.323707 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.760865 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.926198 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.926648 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cae37e8d-39b5-4045-aa76-b36630130555" containerName="kube-state-metrics" containerID="cri-o://ffaf001983b0e0e79ddb3a63e0ef90c30af0324d2384906b1e696e1b7cf09210" gracePeriod=30 Mar 20 10:59:06 crc kubenswrapper[4748]: I0320 10:59:06.935624 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"004a36eb-3b0d-4a36-acf2-f9b8187bcef4","Type":"ContainerStarted","Data":"253ff5e67f6ff840b4ccbd4729eced0001f2c3c4c20b8d26f5de2a889751aa7e"} Mar 20 10:59:07 crc kubenswrapper[4748]: I0320 10:59:07.529364 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03bcc896-7c4f-4db1-97a4-c8b48821bd1d" path="/var/lib/kubelet/pods/03bcc896-7c4f-4db1-97a4-c8b48821bd1d/volumes" Mar 20 10:59:07 crc kubenswrapper[4748]: I0320 10:59:07.948361 4748 generic.go:334] "Generic (PLEG): container finished" podID="cae37e8d-39b5-4045-aa76-b36630130555" containerID="ffaf001983b0e0e79ddb3a63e0ef90c30af0324d2384906b1e696e1b7cf09210" exitCode=2 Mar 20 10:59:07 crc kubenswrapper[4748]: I0320 10:59:07.948470 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"cae37e8d-39b5-4045-aa76-b36630130555","Type":"ContainerDied","Data":"ffaf001983b0e0e79ddb3a63e0ef90c30af0324d2384906b1e696e1b7cf09210"} Mar 20 10:59:07 crc kubenswrapper[4748]: I0320 10:59:07.952055 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"004a36eb-3b0d-4a36-acf2-f9b8187bcef4","Type":"ContainerStarted","Data":"b09444d0b28597ebdc3857a37e7f455eaea7e7cca49abbdac92f5f1276f00468"} Mar 20 10:59:07 crc kubenswrapper[4748]: I0320 10:59:07.954584 4748 generic.go:334] "Generic (PLEG): container finished" podID="d7263772-e7ec-43ad-815f-7c6a67575402" containerID="6ffd362a35588b891b195eb4be044481eb6c22e82158760f46e425d88d3a61b6" exitCode=0 Mar 20 10:59:07 crc kubenswrapper[4748]: I0320 10:59:07.954656 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hww8p" event={"ID":"d7263772-e7ec-43ad-815f-7c6a67575402","Type":"ContainerDied","Data":"6ffd362a35588b891b195eb4be044481eb6c22e82158760f46e425d88d3a61b6"} Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.068876 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.072109 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.104283 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.104324 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.163130 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.280550 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.399091 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.400067 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsvcl\" (UniqueName: \"kubernetes.io/projected/cae37e8d-39b5-4045-aa76-b36630130555-kube-api-access-fsvcl\") pod \"cae37e8d-39b5-4045-aa76-b36630130555\" (UID: \"cae37e8d-39b5-4045-aa76-b36630130555\") " Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.408847 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae37e8d-39b5-4045-aa76-b36630130555-kube-api-access-fsvcl" (OuterVolumeSpecName: "kube-api-access-fsvcl") pod "cae37e8d-39b5-4045-aa76-b36630130555" (UID: "cae37e8d-39b5-4045-aa76-b36630130555"). InnerVolumeSpecName "kube-api-access-fsvcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.421059 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.484263 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-dg9px"] Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.484550 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" podUID="f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" containerName="dnsmasq-dns" containerID="cri-o://9865d2f53475bb341f646c1a24fffb584a17e8effa6599ec36b2ad72671239b1" gracePeriod=10 Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.502402 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsvcl\" (UniqueName: \"kubernetes.io/projected/cae37e8d-39b5-4045-aa76-b36630130555-kube-api-access-fsvcl\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:08 crc kubenswrapper[4748]: E0320 10:59:08.640698 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25ff3ee_c6f6_4b0c_ab9b_b16d5f1d33ba.slice/crio-9865d2f53475bb341f646c1a24fffb584a17e8effa6599ec36b2ad72671239b1.scope\": RecentStats: unable to find data in memory cache]" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.964682 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cae37e8d-39b5-4045-aa76-b36630130555","Type":"ContainerDied","Data":"1f9ceb1d83b0268a51fcbb041538e072d0a06d480ae737ff6447b08905aafe3f"} Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.964760 4748 scope.go:117] "RemoveContainer" containerID="ffaf001983b0e0e79ddb3a63e0ef90c30af0324d2384906b1e696e1b7cf09210" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 
10:59:08.964918 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.969611 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"004a36eb-3b0d-4a36-acf2-f9b8187bcef4","Type":"ContainerStarted","Data":"f71e88db4a9250b6893dc8de5e8a6e00028b0de0a607e36410458115cbcbc439"} Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.981341 4748 generic.go:334] "Generic (PLEG): container finished" podID="f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" containerID="9865d2f53475bb341f646c1a24fffb584a17e8effa6599ec36b2ad72671239b1" exitCode=0 Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.982361 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" event={"ID":"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba","Type":"ContainerDied","Data":"9865d2f53475bb341f646c1a24fffb584a17e8effa6599ec36b2ad72671239b1"} Mar 20 10:59:08 crc kubenswrapper[4748]: I0320 10:59:08.997974 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.997954392 podStartE2EDuration="3.997954392s" podCreationTimestamp="2026-03-20 10:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:08.985664384 +0000 UTC m=+1384.127210208" watchObservedRunningTime="2026-03-20 10:59:08.997954392 +0000 UTC m=+1384.139500206" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.010936 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.030574 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.035047 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-scheduler-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.037158 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 10:59:09 crc kubenswrapper[4748]: E0320 10:59:09.037599 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae37e8d-39b5-4045-aa76-b36630130555" containerName="kube-state-metrics" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.037620 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae37e8d-39b5-4045-aa76-b36630130555" containerName="kube-state-metrics" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.037810 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae37e8d-39b5-4045-aa76-b36630130555" containerName="kube-state-metrics" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.038507 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.042205 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.042444 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.048669 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.113308 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.113364 4748 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krt88\" (UniqueName: \"kubernetes.io/projected/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-kube-api-access-krt88\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.113432 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.113511 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.152085 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.152104 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.214918 4748 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.214995 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.215025 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krt88\" (UniqueName: \"kubernetes.io/projected/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-kube-api-access-krt88\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.215092 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.221035 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.223673 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" 
(UniqueName: \"kubernetes.io/secret/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.225074 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.239911 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krt88\" (UniqueName: \"kubernetes.io/projected/e9d06c7d-5d90-45f8-b4df-b53bff4761a5-kube-api-access-krt88\") pod \"kube-state-metrics-0\" (UID: \"e9d06c7d-5d90-45f8-b4df-b53bff4761a5\") " pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.365386 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.370799 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.419120 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-scripts\") pod \"d7263772-e7ec-43ad-815f-7c6a67575402\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.419198 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-config-data\") pod \"d7263772-e7ec-43ad-815f-7c6a67575402\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.419251 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-combined-ca-bundle\") pod \"d7263772-e7ec-43ad-815f-7c6a67575402\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.419353 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7mcb\" (UniqueName: \"kubernetes.io/projected/d7263772-e7ec-43ad-815f-7c6a67575402-kube-api-access-p7mcb\") pod \"d7263772-e7ec-43ad-815f-7c6a67575402\" (UID: \"d7263772-e7ec-43ad-815f-7c6a67575402\") " Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.425326 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-scripts" (OuterVolumeSpecName: "scripts") pod "d7263772-e7ec-43ad-815f-7c6a67575402" (UID: "d7263772-e7ec-43ad-815f-7c6a67575402"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.425381 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7263772-e7ec-43ad-815f-7c6a67575402-kube-api-access-p7mcb" (OuterVolumeSpecName: "kube-api-access-p7mcb") pod "d7263772-e7ec-43ad-815f-7c6a67575402" (UID: "d7263772-e7ec-43ad-815f-7c6a67575402"). InnerVolumeSpecName "kube-api-access-p7mcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.452026 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-config-data" (OuterVolumeSpecName: "config-data") pod "d7263772-e7ec-43ad-815f-7c6a67575402" (UID: "d7263772-e7ec-43ad-815f-7c6a67575402"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.459195 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7263772-e7ec-43ad-815f-7c6a67575402" (UID: "d7263772-e7ec-43ad-815f-7c6a67575402"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.521778 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.521820 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.521979 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7263772-e7ec-43ad-815f-7c6a67575402-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.522000 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7mcb\" (UniqueName: \"kubernetes.io/projected/d7263772-e7ec-43ad-815f-7c6a67575402-kube-api-access-p7mcb\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.530921 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae37e8d-39b5-4045-aa76-b36630130555" path="/var/lib/kubelet/pods/cae37e8d-39b5-4045-aa76-b36630130555/volumes" Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.717480 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.717773 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="ceilometer-central-agent" containerID="cri-o://160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab" gracePeriod=30 Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.717907 4748 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ceilometer-0" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="proxy-httpd" containerID="cri-o://cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf" gracePeriod=30 Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.717971 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="ceilometer-notification-agent" containerID="cri-o://7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499" gracePeriod=30 Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.718149 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="sg-core" containerID="cri-o://fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a" gracePeriod=30 Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.852820 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.876484 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.995461 4748 generic.go:334] "Generic (PLEG): container finished" podID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerID="fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a" exitCode=2 Mar 20 10:59:09 crc kubenswrapper[4748]: I0320 10:59:09.995535 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c9d3443-1243-4415-a0fe-747695b73aa4","Type":"ContainerDied","Data":"fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a"} Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.008116 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hww8p" 
event={"ID":"d7263772-e7ec-43ad-815f-7c6a67575402","Type":"ContainerDied","Data":"90083acdc38d59c542973c9bfec48fd2c41804d4fe37ab0108a61291548240dd"} Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.008161 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90083acdc38d59c542973c9bfec48fd2c41804d4fe37ab0108a61291548240dd" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.008264 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hww8p" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.015454 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e9d06c7d-5d90-45f8-b4df-b53bff4761a5","Type":"ContainerStarted","Data":"6318dcfd000387dfaa73c407e3331d299212b5ee25995602c2c8750476ab475d"} Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.106645 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.231033 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.231295 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerName="nova-api-log" containerID="cri-o://ddcac3d621e07e876e9eb9fe3a0fa0559595aa5e04dc6c8838cf569b7effb9cf" gracePeriod=30 Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.231771 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerName="nova-api-api" containerID="cri-o://f92e189958f888ce283b036bc284b4431af82bd0c2f0bd94e1a43b803ebb2fc2" gracePeriod=30 Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.236541 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-sb\") pod \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.236591 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kkfn\" (UniqueName: \"kubernetes.io/projected/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-kube-api-access-8kkfn\") pod \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.236659 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-config\") pod \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.236758 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-svc\") pod \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.236822 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-nb\") pod \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\" (UID: \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.236904 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-swift-storage-0\") pod \"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\" (UID: 
\"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba\") " Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.256850 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.273046 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-kube-api-access-8kkfn" (OuterVolumeSpecName: "kube-api-access-8kkfn") pod "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" (UID: "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba"). InnerVolumeSpecName "kube-api-access-8kkfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.339098 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kkfn\" (UniqueName: \"kubernetes.io/projected/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-kube-api-access-8kkfn\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.384029 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.395273 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" (UID: "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.402630 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" (UID: "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.404977 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" (UID: "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.412918 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" (UID: "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.440685 4748 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.440724 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.440738 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.440749 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-ovsdbserver-nb\") on node \"crc\" DevicePath 
\"\"" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.559614 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-config" (OuterVolumeSpecName: "config") pod "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" (UID: "f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:10 crc kubenswrapper[4748]: I0320 10:59:10.644767 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.025997 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" event={"ID":"f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba","Type":"ContainerDied","Data":"093dd8b3922c27300d223e058e975fa705a15dff63764c4d7160dce70705a0f4"} Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.026040 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-dg9px" Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.026100 4748 scope.go:117] "RemoveContainer" containerID="9865d2f53475bb341f646c1a24fffb584a17e8effa6599ec36b2ad72671239b1" Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.033848 4748 generic.go:334] "Generic (PLEG): container finished" podID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerID="cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf" exitCode=0 Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.033880 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c9d3443-1243-4415-a0fe-747695b73aa4","Type":"ContainerDied","Data":"cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf"} Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.049801 4748 generic.go:334] "Generic (PLEG): container finished" podID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerID="ddcac3d621e07e876e9eb9fe3a0fa0559595aa5e04dc6c8838cf569b7effb9cf" exitCode=143 Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.050242 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a3539da7-c3da-4c63-9eec-56bf69254d9f" containerName="nova-scheduler-scheduler" containerID="cri-o://8924d11d2e8e9ae5dc583afbb64c79638c1923c1eb98201390ee1811f9ad2fa6" gracePeriod=30 Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.050706 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f45b2b72-df5d-4ae6-9f5a-8f6cced75642","Type":"ContainerDied","Data":"ddcac3d621e07e876e9eb9fe3a0fa0559595aa5e04dc6c8838cf569b7effb9cf"} Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.050986 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" containerName="nova-metadata-log" 
containerID="cri-o://b09444d0b28597ebdc3857a37e7f455eaea7e7cca49abbdac92f5f1276f00468" gracePeriod=30 Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.051079 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" containerName="nova-metadata-metadata" containerID="cri-o://f71e88db4a9250b6893dc8de5e8a6e00028b0de0a607e36410458115cbcbc439" gracePeriod=30 Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.087480 4748 scope.go:117] "RemoveContainer" containerID="e47ff1992ccdb4eeeb03582ed6109b9aa838fd7fcc9045eabe6921d6bf8e5b87" Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.126084 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-dg9px"] Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.142574 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-dg9px"] Mar 20 10:59:11 crc kubenswrapper[4748]: I0320 10:59:11.529993 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" path="/var/lib/kubelet/pods/f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba/volumes" Mar 20 10:59:12 crc kubenswrapper[4748]: I0320 10:59:12.060417 4748 generic.go:334] "Generic (PLEG): container finished" podID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" containerID="b09444d0b28597ebdc3857a37e7f455eaea7e7cca49abbdac92f5f1276f00468" exitCode=143 Mar 20 10:59:12 crc kubenswrapper[4748]: I0320 10:59:12.060483 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"004a36eb-3b0d-4a36-acf2-f9b8187bcef4","Type":"ContainerDied","Data":"b09444d0b28597ebdc3857a37e7f455eaea7e7cca49abbdac92f5f1276f00468"} Mar 20 10:59:12 crc kubenswrapper[4748]: I0320 10:59:12.064952 4748 generic.go:334] "Generic (PLEG): container finished" podID="0c9d3443-1243-4415-a0fe-747695b73aa4" 
containerID="160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab" exitCode=0 Mar 20 10:59:12 crc kubenswrapper[4748]: I0320 10:59:12.064985 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c9d3443-1243-4415-a0fe-747695b73aa4","Type":"ContainerDied","Data":"160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab"} Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.077414 4748 generic.go:334] "Generic (PLEG): container finished" podID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" containerID="f71e88db4a9250b6893dc8de5e8a6e00028b0de0a607e36410458115cbcbc439" exitCode=0 Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.077480 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"004a36eb-3b0d-4a36-acf2-f9b8187bcef4","Type":"ContainerDied","Data":"f71e88db4a9250b6893dc8de5e8a6e00028b0de0a607e36410458115cbcbc439"} Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.077984 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"004a36eb-3b0d-4a36-acf2-f9b8187bcef4","Type":"ContainerDied","Data":"253ff5e67f6ff840b4ccbd4729eced0001f2c3c4c20b8d26f5de2a889751aa7e"} Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.078000 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="253ff5e67f6ff840b4ccbd4729eced0001f2c3c4c20b8d26f5de2a889751aa7e" Mar 20 10:59:13 crc kubenswrapper[4748]: E0320 10:59:13.106434 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8924d11d2e8e9ae5dc583afbb64c79638c1923c1eb98201390ee1811f9ad2fa6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 10:59:13 crc kubenswrapper[4748]: E0320 10:59:13.107801 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8924d11d2e8e9ae5dc583afbb64c79638c1923c1eb98201390ee1811f9ad2fa6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 10:59:13 crc kubenswrapper[4748]: E0320 10:59:13.108808 4748 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8924d11d2e8e9ae5dc583afbb64c79638c1923c1eb98201390ee1811f9ad2fa6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 10:59:13 crc kubenswrapper[4748]: E0320 10:59:13.108866 4748 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a3539da7-c3da-4c63-9eec-56bf69254d9f" containerName="nova-scheduler-scheduler" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.124348 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.196872 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-combined-ca-bundle\") pod \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.197000 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n2vk\" (UniqueName: \"kubernetes.io/projected/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-kube-api-access-8n2vk\") pod \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.197059 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-nova-metadata-tls-certs\") pod \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.197181 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-logs\") pod \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.197221 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-config-data\") pod \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\" (UID: \"004a36eb-3b0d-4a36-acf2-f9b8187bcef4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.197509 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-logs" (OuterVolumeSpecName: "logs") pod "004a36eb-3b0d-4a36-acf2-f9b8187bcef4" (UID: "004a36eb-3b0d-4a36-acf2-f9b8187bcef4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.197649 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.203902 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-kube-api-access-8n2vk" (OuterVolumeSpecName: "kube-api-access-8n2vk") pod "004a36eb-3b0d-4a36-acf2-f9b8187bcef4" (UID: "004a36eb-3b0d-4a36-acf2-f9b8187bcef4"). InnerVolumeSpecName "kube-api-access-8n2vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.231326 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-config-data" (OuterVolumeSpecName: "config-data") pod "004a36eb-3b0d-4a36-acf2-f9b8187bcef4" (UID: "004a36eb-3b0d-4a36-acf2-f9b8187bcef4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.238586 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "004a36eb-3b0d-4a36-acf2-f9b8187bcef4" (UID: "004a36eb-3b0d-4a36-acf2-f9b8187bcef4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.249854 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "004a36eb-3b0d-4a36-acf2-f9b8187bcef4" (UID: "004a36eb-3b0d-4a36-acf2-f9b8187bcef4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.299153 4748 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.299188 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.299201 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.299211 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n2vk\" (UniqueName: \"kubernetes.io/projected/004a36eb-3b0d-4a36-acf2-f9b8187bcef4-kube-api-access-8n2vk\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.743022 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.809177 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-sg-core-conf-yaml\") pod \"0c9d3443-1243-4415-a0fe-747695b73aa4\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.809352 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-run-httpd\") pod \"0c9d3443-1243-4415-a0fe-747695b73aa4\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.809999 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0c9d3443-1243-4415-a0fe-747695b73aa4" (UID: "0c9d3443-1243-4415-a0fe-747695b73aa4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.810116 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-scripts\") pod \"0c9d3443-1243-4415-a0fe-747695b73aa4\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.810146 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rddnp\" (UniqueName: \"kubernetes.io/projected/0c9d3443-1243-4415-a0fe-747695b73aa4-kube-api-access-rddnp\") pod \"0c9d3443-1243-4415-a0fe-747695b73aa4\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.810533 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-combined-ca-bundle\") pod \"0c9d3443-1243-4415-a0fe-747695b73aa4\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.810610 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-config-data\") pod \"0c9d3443-1243-4415-a0fe-747695b73aa4\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.810718 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-log-httpd\") pod \"0c9d3443-1243-4415-a0fe-747695b73aa4\" (UID: \"0c9d3443-1243-4415-a0fe-747695b73aa4\") " Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.811675 4748 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.812956 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0c9d3443-1243-4415-a0fe-747695b73aa4" (UID: "0c9d3443-1243-4415-a0fe-747695b73aa4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.816325 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-scripts" (OuterVolumeSpecName: "scripts") pod "0c9d3443-1243-4415-a0fe-747695b73aa4" (UID: "0c9d3443-1243-4415-a0fe-747695b73aa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.819321 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9d3443-1243-4415-a0fe-747695b73aa4-kube-api-access-rddnp" (OuterVolumeSpecName: "kube-api-access-rddnp") pod "0c9d3443-1243-4415-a0fe-747695b73aa4" (UID: "0c9d3443-1243-4415-a0fe-747695b73aa4"). InnerVolumeSpecName "kube-api-access-rddnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.845356 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0c9d3443-1243-4415-a0fe-747695b73aa4" (UID: "0c9d3443-1243-4415-a0fe-747695b73aa4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.901343 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c9d3443-1243-4415-a0fe-747695b73aa4" (UID: "0c9d3443-1243-4415-a0fe-747695b73aa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.917249 4748 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3443-1243-4415-a0fe-747695b73aa4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.917287 4748 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.917298 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.917310 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rddnp\" (UniqueName: \"kubernetes.io/projected/0c9d3443-1243-4415-a0fe-747695b73aa4-kube-api-access-rddnp\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.917320 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:13 crc kubenswrapper[4748]: I0320 10:59:13.953079 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-config-data" (OuterVolumeSpecName: "config-data") pod "0c9d3443-1243-4415-a0fe-747695b73aa4" (UID: "0c9d3443-1243-4415-a0fe-747695b73aa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.019394 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9d3443-1243-4415-a0fe-747695b73aa4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.096318 4748 generic.go:334] "Generic (PLEG): container finished" podID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerID="7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499" exitCode=0 Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.096411 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.096414 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c9d3443-1243-4415-a0fe-747695b73aa4","Type":"ContainerDied","Data":"7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499"} Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.096487 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c9d3443-1243-4415-a0fe-747695b73aa4","Type":"ContainerDied","Data":"895665576deb408245e06e0e474d1d41dd11e139772f6d93ea44e63822923789"} Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.096513 4748 scope.go:117] "RemoveContainer" containerID="cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.102990 4748 generic.go:334] "Generic (PLEG): container finished" podID="a3539da7-c3da-4c63-9eec-56bf69254d9f" containerID="8924d11d2e8e9ae5dc583afbb64c79638c1923c1eb98201390ee1811f9ad2fa6" 
exitCode=0 Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.103099 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a3539da7-c3da-4c63-9eec-56bf69254d9f","Type":"ContainerDied","Data":"8924d11d2e8e9ae5dc583afbb64c79638c1923c1eb98201390ee1811f9ad2fa6"} Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.104814 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.105994 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e9d06c7d-5d90-45f8-b4df-b53bff4761a5","Type":"ContainerStarted","Data":"98e18196ae24af733647df56d1b84ed1163a2feac3b0db335e69587f966ee785"} Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.106105 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.129348 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.161456966 podStartE2EDuration="5.129326379s" podCreationTimestamp="2026-03-20 10:59:09 +0000 UTC" firstStartedPulling="2026-03-20 10:59:09.876279579 +0000 UTC m=+1385.017825383" lastFinishedPulling="2026-03-20 10:59:12.844148982 +0000 UTC m=+1387.985694796" observedRunningTime="2026-03-20 10:59:14.125235346 +0000 UTC m=+1389.266781170" watchObservedRunningTime="2026-03-20 10:59:14.129326379 +0000 UTC m=+1389.270872193" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.129390 4748 scope.go:117] "RemoveContainer" containerID="fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.174036 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.192173 4748 scope.go:117] 
"RemoveContainer" containerID="7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.196729 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.207073 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.220702 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231013 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.231459 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="ceilometer-central-agent" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231477 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="ceilometer-central-agent" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.231493 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" containerName="dnsmasq-dns" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231500 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" containerName="dnsmasq-dns" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.231511 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="proxy-httpd" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231518 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="proxy-httpd" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.231529 4748 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" containerName="nova-metadata-metadata" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231535 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" containerName="nova-metadata-metadata" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.231545 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7263772-e7ec-43ad-815f-7c6a67575402" containerName="nova-manage" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231550 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7263772-e7ec-43ad-815f-7c6a67575402" containerName="nova-manage" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.231576 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="ceilometer-notification-agent" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231584 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="ceilometer-notification-agent" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.231598 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" containerName="init" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231604 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" containerName="init" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.231611 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" containerName="nova-metadata-log" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231617 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" containerName="nova-metadata-log" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.231627 4748 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="sg-core" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231634 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="sg-core" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231798 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" containerName="nova-metadata-log" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231815 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="ceilometer-central-agent" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231825 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7263772-e7ec-43ad-815f-7c6a67575402" containerName="nova-manage" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231845 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="proxy-httpd" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231855 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="sg-core" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231865 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25ff3ee-c6f6-4b0c-ab9b-b16d5f1d33ba" containerName="dnsmasq-dns" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231876 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" containerName="nova-metadata-metadata" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.231885 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" containerName="ceilometer-notification-agent" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.233533 4748 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.235654 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.235988 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.236176 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.240405 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.241922 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.245548 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.245558 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.274764 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.290707 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.321593 4748 scope.go:117] "RemoveContainer" containerID="160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.325460 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.332683 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.332755 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74dbn\" (UniqueName: \"kubernetes.io/projected/d51df005-e870-4b00-9eac-72bb099077dd-kube-api-access-74dbn\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.332807 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2xrx\" (UniqueName: \"kubernetes.io/projected/ad5425ff-9e86-4845-b272-14383e2166d9-kube-api-access-j2xrx\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.332850 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-config-data\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.332877 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 
10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.332901 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-config-data\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.333139 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-run-httpd\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.333220 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-log-httpd\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.333257 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-scripts\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.333400 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.333509 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.333556 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d51df005-e870-4b00-9eac-72bb099077dd-logs\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.333694 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.347710 4748 scope.go:117] "RemoveContainer" containerID="cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.349481 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf\": container with ID starting with cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf not found: ID does not exist" containerID="cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.349523 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf"} err="failed to get container status \"cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf\": rpc error: code = NotFound desc = could not 
find container \"cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf\": container with ID starting with cfabfd00d8ac8ab3af11eab8c1a1eef9a80078ebe7308993892e26437662eebf not found: ID does not exist" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.349554 4748 scope.go:117] "RemoveContainer" containerID="fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.349942 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a\": container with ID starting with fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a not found: ID does not exist" containerID="fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.349968 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a"} err="failed to get container status \"fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a\": rpc error: code = NotFound desc = could not find container \"fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a\": container with ID starting with fd9aa1368ea7942f8510cc92d750f198adc44655f2573edb5b8deda50d649c2a not found: ID does not exist" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.349986 4748 scope.go:117] "RemoveContainer" containerID="7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.350226 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499\": container with ID starting with 7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499 not found: ID 
does not exist" containerID="7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.350248 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499"} err="failed to get container status \"7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499\": rpc error: code = NotFound desc = could not find container \"7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499\": container with ID starting with 7829ba8789365041b79bc2d828c39594d8a6d5d4aadd7880de8d298b87680499 not found: ID does not exist" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.350263 4748 scope.go:117] "RemoveContainer" containerID="160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab" Mar 20 10:59:14 crc kubenswrapper[4748]: E0320 10:59:14.350469 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab\": container with ID starting with 160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab not found: ID does not exist" containerID="160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.350490 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab"} err="failed to get container status \"160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab\": rpc error: code = NotFound desc = could not find container \"160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab\": container with ID starting with 160c93f583183a2c237e35815bfbc7537348f3742cb2a528ef8c6bbdf2f0bdab not found: ID does not exist" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.434501 4748 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqw4v\" (UniqueName: \"kubernetes.io/projected/a3539da7-c3da-4c63-9eec-56bf69254d9f-kube-api-access-zqw4v\") pod \"a3539da7-c3da-4c63-9eec-56bf69254d9f\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.434561 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-config-data\") pod \"a3539da7-c3da-4c63-9eec-56bf69254d9f\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.434778 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-combined-ca-bundle\") pod \"a3539da7-c3da-4c63-9eec-56bf69254d9f\" (UID: \"a3539da7-c3da-4c63-9eec-56bf69254d9f\") " Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435315 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xrx\" (UniqueName: \"kubernetes.io/projected/ad5425ff-9e86-4845-b272-14383e2166d9-kube-api-access-j2xrx\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435389 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-config-data\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435416 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435436 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-config-data\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435543 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-run-httpd\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435574 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-log-httpd\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435592 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-scripts\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435696 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435768 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435792 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d51df005-e870-4b00-9eac-72bb099077dd-logs\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435886 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435962 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.435995 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74dbn\" (UniqueName: \"kubernetes.io/projected/d51df005-e870-4b00-9eac-72bb099077dd-kube-api-access-74dbn\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.436116 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-run-httpd\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " 
pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.436647 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-log-httpd\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.437382 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d51df005-e870-4b00-9eac-72bb099077dd-logs\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.439336 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.441117 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-config-data\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.442312 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-config-data\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.442580 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-scripts\") pod 
\"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.442581 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3539da7-c3da-4c63-9eec-56bf69254d9f-kube-api-access-zqw4v" (OuterVolumeSpecName: "kube-api-access-zqw4v") pod "a3539da7-c3da-4c63-9eec-56bf69254d9f" (UID: "a3539da7-c3da-4c63-9eec-56bf69254d9f"). InnerVolumeSpecName "kube-api-access-zqw4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.443269 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.443305 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.444577 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.448539 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 
crc kubenswrapper[4748]: I0320 10:59:14.452671 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dbn\" (UniqueName: \"kubernetes.io/projected/d51df005-e870-4b00-9eac-72bb099077dd-kube-api-access-74dbn\") pod \"nova-metadata-0\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " pod="openstack/nova-metadata-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.452962 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xrx\" (UniqueName: \"kubernetes.io/projected/ad5425ff-9e86-4845-b272-14383e2166d9-kube-api-access-j2xrx\") pod \"ceilometer-0\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.466872 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3539da7-c3da-4c63-9eec-56bf69254d9f" (UID: "a3539da7-c3da-4c63-9eec-56bf69254d9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.472940 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-config-data" (OuterVolumeSpecName: "config-data") pod "a3539da7-c3da-4c63-9eec-56bf69254d9f" (UID: "a3539da7-c3da-4c63-9eec-56bf69254d9f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.538277 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.538309 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqw4v\" (UniqueName: \"kubernetes.io/projected/a3539da7-c3da-4c63-9eec-56bf69254d9f-kube-api-access-zqw4v\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.538321 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3539da7-c3da-4c63-9eec-56bf69254d9f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.649758 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:14 crc kubenswrapper[4748]: I0320 10:59:14.669481 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.123792 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.124071 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a3539da7-c3da-4c63-9eec-56bf69254d9f","Type":"ContainerDied","Data":"5e5a5de8544b037774fafaa7b774b6011ddae98c5647d3c8a2783f9e11d96560"} Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.124332 4748 scope.go:117] "RemoveContainer" containerID="8924d11d2e8e9ae5dc583afbb64c79638c1923c1eb98201390ee1811f9ad2fa6" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.164151 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.173930 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.212545 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 10:59:15 crc kubenswrapper[4748]: E0320 10:59:15.212999 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3539da7-c3da-4c63-9eec-56bf69254d9f" containerName="nova-scheduler-scheduler" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.213015 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3539da7-c3da-4c63-9eec-56bf69254d9f" containerName="nova-scheduler-scheduler" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.213177 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3539da7-c3da-4c63-9eec-56bf69254d9f" containerName="nova-scheduler-scheduler" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.213911 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.218157 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.226679 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.244446 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.255377 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.255474 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbmch\" (UniqueName: \"kubernetes.io/projected/794b1ed4-1370-4c35-b4b2-7ad2da01f690-kube-api-access-mbmch\") pod \"nova-scheduler-0\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.255656 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-config-data\") pod \"nova-scheduler-0\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.276028 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:15 crc kubenswrapper[4748]: W0320 10:59:15.280976 4748 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd51df005_e870_4b00_9eac_72bb099077dd.slice/crio-401b2aa649fbc7dd43a173a28b7f0cbb58b992cc040162efa4fbc478c77e2c88 WatchSource:0}: Error finding container 401b2aa649fbc7dd43a173a28b7f0cbb58b992cc040162efa4fbc478c77e2c88: Status 404 returned error can't find the container with id 401b2aa649fbc7dd43a173a28b7f0cbb58b992cc040162efa4fbc478c77e2c88 Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.357207 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbmch\" (UniqueName: \"kubernetes.io/projected/794b1ed4-1370-4c35-b4b2-7ad2da01f690-kube-api-access-mbmch\") pod \"nova-scheduler-0\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.357730 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-config-data\") pod \"nova-scheduler-0\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.357819 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.363675 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-config-data\") pod \"nova-scheduler-0\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.363715 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.379778 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbmch\" (UniqueName: \"kubernetes.io/projected/794b1ed4-1370-4c35-b4b2-7ad2da01f690-kube-api-access-mbmch\") pod \"nova-scheduler-0\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.425712 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.543014 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="004a36eb-3b0d-4a36-acf2-f9b8187bcef4" path="/var/lib/kubelet/pods/004a36eb-3b0d-4a36-acf2-f9b8187bcef4/volumes" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.543788 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9d3443-1243-4415-a0fe-747695b73aa4" path="/var/lib/kubelet/pods/0c9d3443-1243-4415-a0fe-747695b73aa4/volumes" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.548247 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3539da7-c3da-4c63-9eec-56bf69254d9f" path="/var/lib/kubelet/pods/a3539da7-c3da-4c63-9eec-56bf69254d9f/volumes" Mar 20 10:59:15 crc kubenswrapper[4748]: I0320 10:59:15.922647 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.069273 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.069381 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 
20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.136464 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d51df005-e870-4b00-9eac-72bb099077dd","Type":"ContainerStarted","Data":"9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d"} Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.136510 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d51df005-e870-4b00-9eac-72bb099077dd","Type":"ContainerStarted","Data":"8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4"} Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.136522 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d51df005-e870-4b00-9eac-72bb099077dd","Type":"ContainerStarted","Data":"401b2aa649fbc7dd43a173a28b7f0cbb58b992cc040162efa4fbc478c77e2c88"} Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.141694 4748 generic.go:334] "Generic (PLEG): container finished" podID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerID="f92e189958f888ce283b036bc284b4431af82bd0c2f0bd94e1a43b803ebb2fc2" exitCode=0 Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.141745 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f45b2b72-df5d-4ae6-9f5a-8f6cced75642","Type":"ContainerDied","Data":"f92e189958f888ce283b036bc284b4431af82bd0c2f0bd94e1a43b803ebb2fc2"} Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.141782 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f45b2b72-df5d-4ae6-9f5a-8f6cced75642","Type":"ContainerDied","Data":"d0b842ac82033584a141103c86b10919396c0bcaaed1b0b4f0466192b92eb967"} Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.141802 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0b842ac82033584a141103c86b10919396c0bcaaed1b0b4f0466192b92eb967" Mar 20 10:59:16 crc 
kubenswrapper[4748]: I0320 10:59:16.147242 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad5425ff-9e86-4845-b272-14383e2166d9","Type":"ContainerStarted","Data":"c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a"} Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.147282 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad5425ff-9e86-4845-b272-14383e2166d9","Type":"ContainerStarted","Data":"9ae39ddda1c75ca7dfe27a2563686c8654c2b7c73adc9932c0209f96dcb78398"} Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.152683 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"794b1ed4-1370-4c35-b4b2-7ad2da01f690","Type":"ContainerStarted","Data":"9c9e6d3afd0cba4660c04af89c500343aadf2d4b4eb1164c1a0e0fd13136e506"} Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.193022 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.223609 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.223583943 podStartE2EDuration="2.223583943s" podCreationTimestamp="2026-03-20 10:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:16.18427123 +0000 UTC m=+1391.325817064" watchObservedRunningTime="2026-03-20 10:59:16.223583943 +0000 UTC m=+1391.365129757" Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.285489 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnnmd\" (UniqueName: \"kubernetes.io/projected/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-kube-api-access-gnnmd\") pod \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " Mar 20 10:59:16 crc 
kubenswrapper[4748]: I0320 10:59:16.285843 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-combined-ca-bundle\") pod \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.285929 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-config-data\") pod \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.285994 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-logs\") pod \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\" (UID: \"f45b2b72-df5d-4ae6-9f5a-8f6cced75642\") " Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.291561 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-logs" (OuterVolumeSpecName: "logs") pod "f45b2b72-df5d-4ae6-9f5a-8f6cced75642" (UID: "f45b2b72-df5d-4ae6-9f5a-8f6cced75642"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.293240 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-kube-api-access-gnnmd" (OuterVolumeSpecName: "kube-api-access-gnnmd") pod "f45b2b72-df5d-4ae6-9f5a-8f6cced75642" (UID: "f45b2b72-df5d-4ae6-9f5a-8f6cced75642"). InnerVolumeSpecName "kube-api-access-gnnmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.316390 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f45b2b72-df5d-4ae6-9f5a-8f6cced75642" (UID: "f45b2b72-df5d-4ae6-9f5a-8f6cced75642"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.320314 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-config-data" (OuterVolumeSpecName: "config-data") pod "f45b2b72-df5d-4ae6-9f5a-8f6cced75642" (UID: "f45b2b72-df5d-4ae6-9f5a-8f6cced75642"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.391711 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.391738 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnnmd\" (UniqueName: \"kubernetes.io/projected/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-kube-api-access-gnnmd\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.391749 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:16 crc kubenswrapper[4748]: I0320 10:59:16.391759 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f45b2b72-df5d-4ae6-9f5a-8f6cced75642-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 
10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.164941 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"794b1ed4-1370-4c35-b4b2-7ad2da01f690","Type":"ContainerStarted","Data":"c7035bb73c98a18f77aaa64deb90fcdc3b05847f09179b722a88febf5926ad0e"} Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.172132 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad5425ff-9e86-4845-b272-14383e2166d9","Type":"ContainerStarted","Data":"b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65"} Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.173050 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.195500 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.195470208 podStartE2EDuration="2.195470208s" podCreationTimestamp="2026-03-20 10:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:17.192897134 +0000 UTC m=+1392.334442938" watchObservedRunningTime="2026-03-20 10:59:17.195470208 +0000 UTC m=+1392.337016022" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.219337 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.239696 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.273823 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:17 crc kubenswrapper[4748]: E0320 10:59:17.274577 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerName="nova-api-log" Mar 20 10:59:17 crc 
kubenswrapper[4748]: I0320 10:59:17.274647 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerName="nova-api-log" Mar 20 10:59:17 crc kubenswrapper[4748]: E0320 10:59:17.274722 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerName="nova-api-api" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.274795 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerName="nova-api-api" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.275133 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerName="nova-api-log" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.275241 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" containerName="nova-api-api" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.277460 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.281185 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.313132 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.313250 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0a406ab-9700-4aff-ac24-931c6862568d-logs\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.313544 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-config-data\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.313678 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qlzc\" (UniqueName: \"kubernetes.io/projected/f0a406ab-9700-4aff-ac24-931c6862568d-kube-api-access-7qlzc\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.324002 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.416310 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7qlzc\" (UniqueName: \"kubernetes.io/projected/f0a406ab-9700-4aff-ac24-931c6862568d-kube-api-access-7qlzc\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.416779 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.417685 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0a406ab-9700-4aff-ac24-931c6862568d-logs\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.417969 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-config-data\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.419162 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0a406ab-9700-4aff-ac24-931c6862568d-logs\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.427090 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.431507 
4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-config-data\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.436499 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qlzc\" (UniqueName: \"kubernetes.io/projected/f0a406ab-9700-4aff-ac24-931c6862568d-kube-api-access-7qlzc\") pod \"nova-api-0\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.542977 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f45b2b72-df5d-4ae6-9f5a-8f6cced75642" path="/var/lib/kubelet/pods/f45b2b72-df5d-4ae6-9f5a-8f6cced75642/volumes" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.620999 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:17 crc kubenswrapper[4748]: I0320 10:59:17.955394 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:18 crc kubenswrapper[4748]: I0320 10:59:18.188592 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0a406ab-9700-4aff-ac24-931c6862568d","Type":"ContainerStarted","Data":"7716272e3fe5b71e79e73ea6c6841c615d63e889dbddd9e80231bc06db813617"} Mar 20 10:59:18 crc kubenswrapper[4748]: I0320 10:59:18.194088 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad5425ff-9e86-4845-b272-14383e2166d9","Type":"ContainerStarted","Data":"a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2"} Mar 20 10:59:19 crc kubenswrapper[4748]: I0320 10:59:19.205868 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f0a406ab-9700-4aff-ac24-931c6862568d","Type":"ContainerStarted","Data":"ad6035d14943caa1073b7650c27606212f0b43e38e1604d5f0cfbb76c606d8b8"} Mar 20 10:59:19 crc kubenswrapper[4748]: I0320 10:59:19.206781 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0a406ab-9700-4aff-ac24-931c6862568d","Type":"ContainerStarted","Data":"daa61352b9ef42dec0ace0828be267eeeca3c60d5c8b943702a096850b4b18ef"} Mar 20 10:59:19 crc kubenswrapper[4748]: I0320 10:59:19.239373 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.239349563 podStartE2EDuration="2.239349563s" podCreationTimestamp="2026-03-20 10:59:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:19.224973924 +0000 UTC m=+1394.366519748" watchObservedRunningTime="2026-03-20 10:59:19.239349563 +0000 UTC m=+1394.380895387" Mar 20 10:59:19 crc kubenswrapper[4748]: I0320 10:59:19.374701 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 10:59:20 crc kubenswrapper[4748]: I0320 10:59:20.219303 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad5425ff-9e86-4845-b272-14383e2166d9","Type":"ContainerStarted","Data":"118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f"} Mar 20 10:59:20 crc kubenswrapper[4748]: I0320 10:59:20.249336 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.029149672 podStartE2EDuration="6.249315511s" podCreationTimestamp="2026-03-20 10:59:14 +0000 UTC" firstStartedPulling="2026-03-20 10:59:15.237175904 +0000 UTC m=+1390.378721718" lastFinishedPulling="2026-03-20 10:59:19.457341743 +0000 UTC m=+1394.598887557" observedRunningTime="2026-03-20 10:59:20.239581538 +0000 UTC m=+1395.381127352" 
watchObservedRunningTime="2026-03-20 10:59:20.249315511 +0000 UTC m=+1395.390861325" Mar 20 10:59:20 crc kubenswrapper[4748]: I0320 10:59:20.426445 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 10:59:21 crc kubenswrapper[4748]: I0320 10:59:21.232576 4748 generic.go:334] "Generic (PLEG): container finished" podID="292d1168-7edc-4e05-a657-c03029450a6b" containerID="d1ebbb3667bd4d483bc0e2f4be4610d8a04be8354aac381ebc50df5e6bad9595" exitCode=0 Mar 20 10:59:21 crc kubenswrapper[4748]: I0320 10:59:21.233688 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rghcf" event={"ID":"292d1168-7edc-4e05-a657-c03029450a6b","Type":"ContainerDied","Data":"d1ebbb3667bd4d483bc0e2f4be4610d8a04be8354aac381ebc50df5e6bad9595"} Mar 20 10:59:21 crc kubenswrapper[4748]: I0320 10:59:21.233728 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.605933 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.627614 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-combined-ca-bundle\") pod \"292d1168-7edc-4e05-a657-c03029450a6b\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.627686 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-config-data\") pod \"292d1168-7edc-4e05-a657-c03029450a6b\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.627803 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcmbv\" (UniqueName: \"kubernetes.io/projected/292d1168-7edc-4e05-a657-c03029450a6b-kube-api-access-xcmbv\") pod \"292d1168-7edc-4e05-a657-c03029450a6b\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.627985 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-scripts\") pod \"292d1168-7edc-4e05-a657-c03029450a6b\" (UID: \"292d1168-7edc-4e05-a657-c03029450a6b\") " Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.636538 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-scripts" (OuterVolumeSpecName: "scripts") pod "292d1168-7edc-4e05-a657-c03029450a6b" (UID: "292d1168-7edc-4e05-a657-c03029450a6b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.641991 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/292d1168-7edc-4e05-a657-c03029450a6b-kube-api-access-xcmbv" (OuterVolumeSpecName: "kube-api-access-xcmbv") pod "292d1168-7edc-4e05-a657-c03029450a6b" (UID: "292d1168-7edc-4e05-a657-c03029450a6b"). InnerVolumeSpecName "kube-api-access-xcmbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.676520 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-config-data" (OuterVolumeSpecName: "config-data") pod "292d1168-7edc-4e05-a657-c03029450a6b" (UID: "292d1168-7edc-4e05-a657-c03029450a6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.676534 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "292d1168-7edc-4e05-a657-c03029450a6b" (UID: "292d1168-7edc-4e05-a657-c03029450a6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.731568 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.731614 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.731626 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcmbv\" (UniqueName: \"kubernetes.io/projected/292d1168-7edc-4e05-a657-c03029450a6b-kube-api-access-xcmbv\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:22 crc kubenswrapper[4748]: I0320 10:59:22.731641 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/292d1168-7edc-4e05-a657-c03029450a6b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.268713 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rghcf" event={"ID":"292d1168-7edc-4e05-a657-c03029450a6b","Type":"ContainerDied","Data":"1f6fdf596716024868d4249232ddfe589fb6b2692b2f13c40fd1e84a4c60a9c7"} Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.268852 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f6fdf596716024868d4249232ddfe589fb6b2692b2f13c40fd1e84a4c60a9c7" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.268994 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rghcf" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.349258 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 10:59:23 crc kubenswrapper[4748]: E0320 10:59:23.350127 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="292d1168-7edc-4e05-a657-c03029450a6b" containerName="nova-cell1-conductor-db-sync" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.350162 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="292d1168-7edc-4e05-a657-c03029450a6b" containerName="nova-cell1-conductor-db-sync" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.350413 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="292d1168-7edc-4e05-a657-c03029450a6b" containerName="nova-cell1-conductor-db-sync" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.351394 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.367581 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.381225 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.448569 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9acc08e-1cf9-4a43-9b60-2bd4e1cad401-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401\") " pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.448719 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w2kz\" (UniqueName: 
\"kubernetes.io/projected/a9acc08e-1cf9-4a43-9b60-2bd4e1cad401-kube-api-access-6w2kz\") pod \"nova-cell1-conductor-0\" (UID: \"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401\") " pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.448956 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9acc08e-1cf9-4a43-9b60-2bd4e1cad401-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401\") " pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.552047 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w2kz\" (UniqueName: \"kubernetes.io/projected/a9acc08e-1cf9-4a43-9b60-2bd4e1cad401-kube-api-access-6w2kz\") pod \"nova-cell1-conductor-0\" (UID: \"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401\") " pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.552485 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9acc08e-1cf9-4a43-9b60-2bd4e1cad401-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401\") " pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.553315 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9acc08e-1cf9-4a43-9b60-2bd4e1cad401-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401\") " pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.556679 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a9acc08e-1cf9-4a43-9b60-2bd4e1cad401-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401\") " pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.557580 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9acc08e-1cf9-4a43-9b60-2bd4e1cad401-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401\") " pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.580677 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w2kz\" (UniqueName: \"kubernetes.io/projected/a9acc08e-1cf9-4a43-9b60-2bd4e1cad401-kube-api-access-6w2kz\") pod \"nova-cell1-conductor-0\" (UID: \"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401\") " pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:23 crc kubenswrapper[4748]: I0320 10:59:23.675529 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:24 crc kubenswrapper[4748]: I0320 10:59:24.186384 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 10:59:24 crc kubenswrapper[4748]: W0320 10:59:24.201223 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9acc08e_1cf9_4a43_9b60_2bd4e1cad401.slice/crio-651139913c580f5bd09a545ef2f3cb5616c8e386d8db8a5c361310b95fa47d44 WatchSource:0}: Error finding container 651139913c580f5bd09a545ef2f3cb5616c8e386d8db8a5c361310b95fa47d44: Status 404 returned error can't find the container with id 651139913c580f5bd09a545ef2f3cb5616c8e386d8db8a5c361310b95fa47d44 Mar 20 10:59:24 crc kubenswrapper[4748]: I0320 10:59:24.281253 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401","Type":"ContainerStarted","Data":"651139913c580f5bd09a545ef2f3cb5616c8e386d8db8a5c361310b95fa47d44"} Mar 20 10:59:24 crc kubenswrapper[4748]: I0320 10:59:24.670294 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 10:59:24 crc kubenswrapper[4748]: I0320 10:59:24.670641 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 10:59:25 crc kubenswrapper[4748]: I0320 10:59:25.291679 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a9acc08e-1cf9-4a43-9b60-2bd4e1cad401","Type":"ContainerStarted","Data":"439c0798b23e6e65d2e668ceea36093839845ec9a2ef218cdd49449986898983"} Mar 20 10:59:25 crc kubenswrapper[4748]: I0320 10:59:25.292039 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:25 crc kubenswrapper[4748]: I0320 10:59:25.312168 4748 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.312148015 podStartE2EDuration="2.312148015s" podCreationTimestamp="2026-03-20 10:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:25.308709719 +0000 UTC m=+1400.450255553" watchObservedRunningTime="2026-03-20 10:59:25.312148015 +0000 UTC m=+1400.453693829" Mar 20 10:59:25 crc kubenswrapper[4748]: I0320 10:59:25.426366 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 10:59:25 crc kubenswrapper[4748]: I0320 10:59:25.462473 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 10:59:25 crc kubenswrapper[4748]: I0320 10:59:25.689087 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d51df005-e870-4b00-9eac-72bb099077dd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 10:59:25 crc kubenswrapper[4748]: I0320 10:59:25.689092 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d51df005-e870-4b00-9eac-72bb099077dd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 10:59:26 crc kubenswrapper[4748]: I0320 10:59:26.344222 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 10:59:27 crc kubenswrapper[4748]: I0320 10:59:27.622332 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 10:59:27 crc kubenswrapper[4748]: I0320 10:59:27.622655 4748 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 10:59:28 crc kubenswrapper[4748]: I0320 10:59:28.705136 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 10:59:28 crc kubenswrapper[4748]: I0320 10:59:28.705537 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 10:59:32 crc kubenswrapper[4748]: I0320 10:59:32.670297 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 10:59:32 crc kubenswrapper[4748]: I0320 10:59:32.670888 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 10:59:33 crc kubenswrapper[4748]: I0320 10:59:33.705528 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 10:59:34 crc kubenswrapper[4748]: I0320 10:59:34.677271 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 10:59:34 crc kubenswrapper[4748]: I0320 10:59:34.681959 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 10:59:34 crc kubenswrapper[4748]: I0320 10:59:34.685801 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.357653 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.392181 4748 generic.go:334] "Generic (PLEG): container finished" podID="b9da5564-8bfd-4c19-acba-4b382519195a" containerID="0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18" exitCode=137 Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.392273 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9da5564-8bfd-4c19-acba-4b382519195a","Type":"ContainerDied","Data":"0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18"} Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.392645 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9da5564-8bfd-4c19-acba-4b382519195a","Type":"ContainerDied","Data":"45d93bab766ffb83b62010f9f0ba1e6f9649e2e23bd740094589ffd3990f860d"} Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.392306 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.392698 4748 scope.go:117] "RemoveContainer" containerID="0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.400128 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.431645 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-config-data\") pod \"b9da5564-8bfd-4c19-acba-4b382519195a\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.431780 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhh87\" (UniqueName: \"kubernetes.io/projected/b9da5564-8bfd-4c19-acba-4b382519195a-kube-api-access-dhh87\") pod \"b9da5564-8bfd-4c19-acba-4b382519195a\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.432027 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-combined-ca-bundle\") pod \"b9da5564-8bfd-4c19-acba-4b382519195a\" (UID: \"b9da5564-8bfd-4c19-acba-4b382519195a\") " Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.479920 4748 scope.go:117] "RemoveContainer" containerID="0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18" Mar 20 10:59:35 crc kubenswrapper[4748]: E0320 10:59:35.482438 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18\": container with ID starting with 
0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18 not found: ID does not exist" containerID="0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.482485 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18"} err="failed to get container status \"0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18\": rpc error: code = NotFound desc = could not find container \"0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18\": container with ID starting with 0cbe55127ac649ea0b81d09353b3b581486ceba01c492c5d463a67bf50225f18 not found: ID does not exist" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.518588 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9da5564-8bfd-4c19-acba-4b382519195a-kube-api-access-dhh87" (OuterVolumeSpecName: "kube-api-access-dhh87") pod "b9da5564-8bfd-4c19-acba-4b382519195a" (UID: "b9da5564-8bfd-4c19-acba-4b382519195a"). InnerVolumeSpecName "kube-api-access-dhh87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.535238 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhh87\" (UniqueName: \"kubernetes.io/projected/b9da5564-8bfd-4c19-acba-4b382519195a-kube-api-access-dhh87\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.601147 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9da5564-8bfd-4c19-acba-4b382519195a" (UID: "b9da5564-8bfd-4c19-acba-4b382519195a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.620108 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-config-data" (OuterVolumeSpecName: "config-data") pod "b9da5564-8bfd-4c19-acba-4b382519195a" (UID: "b9da5564-8bfd-4c19-acba-4b382519195a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.622007 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.623435 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.641168 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.641224 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9da5564-8bfd-4c19-acba-4b382519195a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.756639 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.778481 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.802351 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 10:59:35 crc kubenswrapper[4748]: E0320 10:59:35.802932 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9da5564-8bfd-4c19-acba-4b382519195a" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.802957 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9da5564-8bfd-4c19-acba-4b382519195a" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.803217 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9da5564-8bfd-4c19-acba-4b382519195a" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.804326 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.807673 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.809481 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.809535 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.814159 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.949615 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.949759 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.949825 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.949886 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhbmn\" (UniqueName: \"kubernetes.io/projected/a2376389-554f-4c38-bfc1-00962d858ff4-kube-api-access-rhbmn\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:35 crc kubenswrapper[4748]: I0320 10:59:35.949924 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.052825 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.053021 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.053118 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.053206 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.053264 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhbmn\" (UniqueName: \"kubernetes.io/projected/a2376389-554f-4c38-bfc1-00962d858ff4-kube-api-access-rhbmn\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.060371 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.060599 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.061367 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.066083 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2376389-554f-4c38-bfc1-00962d858ff4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.072022 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhbmn\" (UniqueName: \"kubernetes.io/projected/a2376389-554f-4c38-bfc1-00962d858ff4-kube-api-access-rhbmn\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2376389-554f-4c38-bfc1-00962d858ff4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.126468 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:36 crc kubenswrapper[4748]: W0320 10:59:36.669485 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2376389_554f_4c38_bfc1_00962d858ff4.slice/crio-b2f10afcd208d8f27730d6c22fee69608533070a7d031ad1cbac4aae939552dd WatchSource:0}: Error finding container b2f10afcd208d8f27730d6c22fee69608533070a7d031ad1cbac4aae939552dd: Status 404 returned error can't find the container with id b2f10afcd208d8f27730d6c22fee69608533070a7d031ad1cbac4aae939552dd Mar 20 10:59:36 crc kubenswrapper[4748]: I0320 10:59:36.672137 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 10:59:37.439872 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a2376389-554f-4c38-bfc1-00962d858ff4","Type":"ContainerStarted","Data":"8aaedc80f16ba473417e0da85d44112478e8f4a83ec4275922a1585895e9d6e8"} Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 10:59:37.440501 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a2376389-554f-4c38-bfc1-00962d858ff4","Type":"ContainerStarted","Data":"b2f10afcd208d8f27730d6c22fee69608533070a7d031ad1cbac4aae939552dd"} Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 10:59:37.467329 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.467305289 podStartE2EDuration="2.467305289s" podCreationTimestamp="2026-03-20 10:59:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:37.463783631 +0000 UTC m=+1412.605329445" watchObservedRunningTime="2026-03-20 10:59:37.467305289 +0000 UTC m=+1412.608851103" Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 
10:59:37.527697 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9da5564-8bfd-4c19-acba-4b382519195a" path="/var/lib/kubelet/pods/b9da5564-8bfd-4c19-acba-4b382519195a/volumes" Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 10:59:37.628038 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 10:59:37.628196 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 10:59:37.631357 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 10:59:37.631445 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 10:59:37.881928 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s9zg6"] Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 10:59:37.884710 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:37 crc kubenswrapper[4748]: I0320 10:59:37.889542 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s9zg6"] Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.003262 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n96fp\" (UniqueName: \"kubernetes.io/projected/58e56aa6-3665-4020-827c-4b961f13924b-kube-api-access-n96fp\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.003321 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.003348 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-config\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.003390 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.003412 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.003606 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.105684 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.105738 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.105785 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.105893 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n96fp\" (UniqueName: \"kubernetes.io/projected/58e56aa6-3665-4020-827c-4b961f13924b-kube-api-access-n96fp\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.105920 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.105940 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-config\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.107006 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-config\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.107087 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.107131 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.107185 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.107904 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.136948 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n96fp\" (UniqueName: \"kubernetes.io/projected/58e56aa6-3665-4020-827c-4b961f13924b-kube-api-access-n96fp\") pod \"dnsmasq-dns-89c5cd4d5-s9zg6\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.224478 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:38 crc kubenswrapper[4748]: I0320 10:59:38.862827 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s9zg6"] Mar 20 10:59:39 crc kubenswrapper[4748]: I0320 10:59:39.470148 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" event={"ID":"58e56aa6-3665-4020-827c-4b961f13924b","Type":"ContainerStarted","Data":"2edad229b6a897d80f6d86fd6f68e37a1aac8156c95f134e75704724811388b0"} Mar 20 10:59:40 crc kubenswrapper[4748]: I0320 10:59:40.495931 4748 generic.go:334] "Generic (PLEG): container finished" podID="58e56aa6-3665-4020-827c-4b961f13924b" containerID="4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920" exitCode=0 Mar 20 10:59:40 crc kubenswrapper[4748]: I0320 10:59:40.496025 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" event={"ID":"58e56aa6-3665-4020-827c-4b961f13924b","Type":"ContainerDied","Data":"4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920"} Mar 20 10:59:40 crc kubenswrapper[4748]: I0320 10:59:40.646730 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:40 crc kubenswrapper[4748]: I0320 10:59:40.647354 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="ceilometer-central-agent" containerID="cri-o://c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a" gracePeriod=30 Mar 20 10:59:40 crc kubenswrapper[4748]: I0320 10:59:40.647492 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="proxy-httpd" containerID="cri-o://118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f" gracePeriod=30 Mar 20 10:59:40 crc 
kubenswrapper[4748]: I0320 10:59:40.647553 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="sg-core" containerID="cri-o://a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2" gracePeriod=30 Mar 20 10:59:40 crc kubenswrapper[4748]: I0320 10:59:40.647595 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="ceilometer-notification-agent" containerID="cri-o://b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65" gracePeriod=30 Mar 20 10:59:40 crc kubenswrapper[4748]: I0320 10:59:40.751732 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.199:3000/\": read tcp 10.217.0.2:41054->10.217.0.199:3000: read: connection reset by peer" Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.089048 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.089323 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" containerName="nova-api-log" containerID="cri-o://ad6035d14943caa1073b7650c27606212f0b43e38e1604d5f0cfbb76c606d8b8" gracePeriod=30 Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.089945 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" containerName="nova-api-api" containerID="cri-o://daa61352b9ef42dec0ace0828be267eeeca3c60d5c8b943702a096850b4b18ef" gracePeriod=30 Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.126733 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.511012 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" event={"ID":"58e56aa6-3665-4020-827c-4b961f13924b","Type":"ContainerStarted","Data":"92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373"} Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.511444 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.516437 4748 generic.go:334] "Generic (PLEG): container finished" podID="f0a406ab-9700-4aff-ac24-931c6862568d" containerID="ad6035d14943caa1073b7650c27606212f0b43e38e1604d5f0cfbb76c606d8b8" exitCode=143 Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.520867 4748 generic.go:334] "Generic (PLEG): container finished" podID="ad5425ff-9e86-4845-b272-14383e2166d9" containerID="118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f" exitCode=0 Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.520905 4748 generic.go:334] "Generic (PLEG): container finished" podID="ad5425ff-9e86-4845-b272-14383e2166d9" containerID="a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2" exitCode=2 Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.520914 4748 generic.go:334] "Generic (PLEG): container finished" podID="ad5425ff-9e86-4845-b272-14383e2166d9" containerID="c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a" exitCode=0 Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.525991 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0a406ab-9700-4aff-ac24-931c6862568d","Type":"ContainerDied","Data":"ad6035d14943caa1073b7650c27606212f0b43e38e1604d5f0cfbb76c606d8b8"} Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.526041 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"ad5425ff-9e86-4845-b272-14383e2166d9","Type":"ContainerDied","Data":"118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f"} Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.526055 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad5425ff-9e86-4845-b272-14383e2166d9","Type":"ContainerDied","Data":"a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2"} Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.526066 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad5425ff-9e86-4845-b272-14383e2166d9","Type":"ContainerDied","Data":"c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a"} Mar 20 10:59:41 crc kubenswrapper[4748]: I0320 10:59:41.553695 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" podStartSLOduration=4.553671172 podStartE2EDuration="4.553671172s" podCreationTimestamp="2026-03-20 10:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:41.53960135 +0000 UTC m=+1416.681147164" watchObservedRunningTime="2026-03-20 10:59:41.553671172 +0000 UTC m=+1416.695216986" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.079798 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.187695 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-config-data\") pod \"ad5425ff-9e86-4845-b272-14383e2166d9\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.188051 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-sg-core-conf-yaml\") pod \"ad5425ff-9e86-4845-b272-14383e2166d9\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.188180 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-run-httpd\") pod \"ad5425ff-9e86-4845-b272-14383e2166d9\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.188217 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-ceilometer-tls-certs\") pod \"ad5425ff-9e86-4845-b272-14383e2166d9\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.188252 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-scripts\") pod \"ad5425ff-9e86-4845-b272-14383e2166d9\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.188277 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-log-httpd\") pod \"ad5425ff-9e86-4845-b272-14383e2166d9\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.188311 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2xrx\" (UniqueName: \"kubernetes.io/projected/ad5425ff-9e86-4845-b272-14383e2166d9-kube-api-access-j2xrx\") pod \"ad5425ff-9e86-4845-b272-14383e2166d9\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.188393 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-combined-ca-bundle\") pod \"ad5425ff-9e86-4845-b272-14383e2166d9\" (UID: \"ad5425ff-9e86-4845-b272-14383e2166d9\") " Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.188540 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ad5425ff-9e86-4845-b272-14383e2166d9" (UID: "ad5425ff-9e86-4845-b272-14383e2166d9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.188930 4748 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.192344 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ad5425ff-9e86-4845-b272-14383e2166d9" (UID: "ad5425ff-9e86-4845-b272-14383e2166d9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.195089 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-scripts" (OuterVolumeSpecName: "scripts") pod "ad5425ff-9e86-4845-b272-14383e2166d9" (UID: "ad5425ff-9e86-4845-b272-14383e2166d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.195232 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5425ff-9e86-4845-b272-14383e2166d9-kube-api-access-j2xrx" (OuterVolumeSpecName: "kube-api-access-j2xrx") pod "ad5425ff-9e86-4845-b272-14383e2166d9" (UID: "ad5425ff-9e86-4845-b272-14383e2166d9"). InnerVolumeSpecName "kube-api-access-j2xrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.220243 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ad5425ff-9e86-4845-b272-14383e2166d9" (UID: "ad5425ff-9e86-4845-b272-14383e2166d9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.264733 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ad5425ff-9e86-4845-b272-14383e2166d9" (UID: "ad5425ff-9e86-4845-b272-14383e2166d9"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.286792 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad5425ff-9e86-4845-b272-14383e2166d9" (UID: "ad5425ff-9e86-4845-b272-14383e2166d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.290296 4748 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.290333 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.290346 4748 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad5425ff-9e86-4845-b272-14383e2166d9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.290359 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2xrx\" (UniqueName: \"kubernetes.io/projected/ad5425ff-9e86-4845-b272-14383e2166d9-kube-api-access-j2xrx\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.290374 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.290385 4748 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.313166 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-config-data" (OuterVolumeSpecName: "config-data") pod "ad5425ff-9e86-4845-b272-14383e2166d9" (UID: "ad5425ff-9e86-4845-b272-14383e2166d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.391930 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad5425ff-9e86-4845-b272-14383e2166d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.534949 4748 generic.go:334] "Generic (PLEG): container finished" podID="ad5425ff-9e86-4845-b272-14383e2166d9" containerID="b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65" exitCode=0 Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.535039 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.535034 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad5425ff-9e86-4845-b272-14383e2166d9","Type":"ContainerDied","Data":"b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65"} Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.535132 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad5425ff-9e86-4845-b272-14383e2166d9","Type":"ContainerDied","Data":"9ae39ddda1c75ca7dfe27a2563686c8654c2b7c73adc9932c0209f96dcb78398"} Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.535163 4748 scope.go:117] "RemoveContainer" containerID="118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.620544 4748 scope.go:117] "RemoveContainer" containerID="a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.635678 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.645081 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.646387 4748 scope.go:117] "RemoveContainer" containerID="b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.668678 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:42 crc kubenswrapper[4748]: E0320 10:59:42.669256 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="ceilometer-notification-agent" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.669281 4748 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="ceilometer-notification-agent" Mar 20 10:59:42 crc kubenswrapper[4748]: E0320 10:59:42.669299 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="sg-core" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.669309 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="sg-core" Mar 20 10:59:42 crc kubenswrapper[4748]: E0320 10:59:42.669329 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="proxy-httpd" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.669339 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="proxy-httpd" Mar 20 10:59:42 crc kubenswrapper[4748]: E0320 10:59:42.669364 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="ceilometer-central-agent" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.669372 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="ceilometer-central-agent" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.669602 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="sg-core" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.669625 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="ceilometer-central-agent" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.669638 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="ceilometer-notification-agent" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.669650 4748 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" containerName="proxy-httpd" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.672329 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.675487 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.675681 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.689449 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.689873 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.691455 4748 scope.go:117] "RemoveContainer" containerID="c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.728136 4748 scope.go:117] "RemoveContainer" containerID="118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f" Mar 20 10:59:42 crc kubenswrapper[4748]: E0320 10:59:42.729231 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f\": container with ID starting with 118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f not found: ID does not exist" containerID="118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.729304 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f"} err="failed to get 
container status \"118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f\": rpc error: code = NotFound desc = could not find container \"118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f\": container with ID starting with 118d7996d5d51ba5518534ba529e848b160156aff361606a977d8635e5201b0f not found: ID does not exist" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.729346 4748 scope.go:117] "RemoveContainer" containerID="a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2" Mar 20 10:59:42 crc kubenswrapper[4748]: E0320 10:59:42.730725 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2\": container with ID starting with a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2 not found: ID does not exist" containerID="a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.730812 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2"} err="failed to get container status \"a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2\": rpc error: code = NotFound desc = could not find container \"a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2\": container with ID starting with a2780081a566931bbdf2713f8a64c548a1165dd595f82c47994f087275155ca2 not found: ID does not exist" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.730876 4748 scope.go:117] "RemoveContainer" containerID="b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65" Mar 20 10:59:42 crc kubenswrapper[4748]: E0320 10:59:42.735721 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65\": container with ID starting with b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65 not found: ID does not exist" containerID="b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.735853 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65"} err="failed to get container status \"b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65\": rpc error: code = NotFound desc = could not find container \"b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65\": container with ID starting with b15e9fc54e396405feab323ce5a4653790b6964fdbc0b0735ae0c4f7de638c65 not found: ID does not exist" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.735896 4748 scope.go:117] "RemoveContainer" containerID="c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a" Mar 20 10:59:42 crc kubenswrapper[4748]: E0320 10:59:42.738638 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a\": container with ID starting with c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a not found: ID does not exist" containerID="c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.738738 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a"} err="failed to get container status \"c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a\": rpc error: code = NotFound desc = could not find container \"c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a\": container with ID 
starting with c7c2234181fbd72fa3280f077fdec6d37abe800855f80896842db22ed8c2c89a not found: ID does not exist" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.801335 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.801394 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.801414 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.801688 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psk4v\" (UniqueName: \"kubernetes.io/projected/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-kube-api-access-psk4v\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.801976 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-config-data\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 
20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.802049 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.802088 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.802170 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-scripts\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.904286 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.904338 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-scripts\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.904406 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.904447 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.904470 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.904546 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psk4v\" (UniqueName: \"kubernetes.io/projected/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-kube-api-access-psk4v\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.904674 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-config-data\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.904709 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc 
kubenswrapper[4748]: I0320 10:59:42.904792 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.905564 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.912229 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.912514 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.912558 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.912757 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-scripts\") pod \"ceilometer-0\" (UID: 
\"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.913386 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-config-data\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.926867 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psk4v\" (UniqueName: \"kubernetes.io/projected/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-kube-api-access-psk4v\") pod \"ceilometer-0\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " pod="openstack/ceilometer-0" Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.962774 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:42 crc kubenswrapper[4748]: I0320 10:59:42.964271 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:43 crc kubenswrapper[4748]: I0320 10:59:43.490702 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:43 crc kubenswrapper[4748]: I0320 10:59:43.530114 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5425ff-9e86-4845-b272-14383e2166d9" path="/var/lib/kubelet/pods/ad5425ff-9e86-4845-b272-14383e2166d9/volumes" Mar 20 10:59:43 crc kubenswrapper[4748]: I0320 10:59:43.549007 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729","Type":"ContainerStarted","Data":"3132411f1033e083f8c0cca1a4a815dd3508fc3f51ebe1cc80bba354d7999b54"} Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.583948 4748 generic.go:334] "Generic (PLEG): container finished" podID="f0a406ab-9700-4aff-ac24-931c6862568d" containerID="daa61352b9ef42dec0ace0828be267eeeca3c60d5c8b943702a096850b4b18ef" exitCode=0 Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.584321 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0a406ab-9700-4aff-ac24-931c6862568d","Type":"ContainerDied","Data":"daa61352b9ef42dec0ace0828be267eeeca3c60d5c8b943702a096850b4b18ef"} Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.843954 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.945375 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0a406ab-9700-4aff-ac24-931c6862568d-logs\") pod \"f0a406ab-9700-4aff-ac24-931c6862568d\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.945533 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qlzc\" (UniqueName: \"kubernetes.io/projected/f0a406ab-9700-4aff-ac24-931c6862568d-kube-api-access-7qlzc\") pod \"f0a406ab-9700-4aff-ac24-931c6862568d\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.945560 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-config-data\") pod \"f0a406ab-9700-4aff-ac24-931c6862568d\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.945622 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-combined-ca-bundle\") pod \"f0a406ab-9700-4aff-ac24-931c6862568d\" (UID: \"f0a406ab-9700-4aff-ac24-931c6862568d\") " Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.946966 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0a406ab-9700-4aff-ac24-931c6862568d-logs" (OuterVolumeSpecName: "logs") pod "f0a406ab-9700-4aff-ac24-931c6862568d" (UID: "f0a406ab-9700-4aff-ac24-931c6862568d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.950467 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0a406ab-9700-4aff-ac24-931c6862568d-kube-api-access-7qlzc" (OuterVolumeSpecName: "kube-api-access-7qlzc") pod "f0a406ab-9700-4aff-ac24-931c6862568d" (UID: "f0a406ab-9700-4aff-ac24-931c6862568d"). InnerVolumeSpecName "kube-api-access-7qlzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.974471 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-config-data" (OuterVolumeSpecName: "config-data") pod "f0a406ab-9700-4aff-ac24-931c6862568d" (UID: "f0a406ab-9700-4aff-ac24-931c6862568d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:44.988182 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0a406ab-9700-4aff-ac24-931c6862568d" (UID: "f0a406ab-9700-4aff-ac24-931c6862568d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.047989 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0a406ab-9700-4aff-ac24-931c6862568d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.048031 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qlzc\" (UniqueName: \"kubernetes.io/projected/f0a406ab-9700-4aff-ac24-931c6862568d-kube-api-access-7qlzc\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.048044 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.048053 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a406ab-9700-4aff-ac24-931c6862568d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.594263 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729","Type":"ContainerStarted","Data":"de2cf53897f1fae8cfe402a1267d257326195e4f7f9853a322c0a26d10c18478"} Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.599212 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0a406ab-9700-4aff-ac24-931c6862568d","Type":"ContainerDied","Data":"7716272e3fe5b71e79e73ea6c6841c615d63e889dbddd9e80231bc06db813617"} Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.599252 4748 scope.go:117] "RemoveContainer" containerID="daa61352b9ef42dec0ace0828be267eeeca3c60d5c8b943702a096850b4b18ef" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.599382 4748 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.630384 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.633752 4748 scope.go:117] "RemoveContainer" containerID="ad6035d14943caa1073b7650c27606212f0b43e38e1604d5f0cfbb76c606d8b8" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.642579 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.649328 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:45 crc kubenswrapper[4748]: E0320 10:59:45.649857 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" containerName="nova-api-api" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.649871 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" containerName="nova-api-api" Mar 20 10:59:45 crc kubenswrapper[4748]: E0320 10:59:45.649891 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" containerName="nova-api-log" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.649897 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" containerName="nova-api-log" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.650099 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" containerName="nova-api-log" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.650118 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" containerName="nova-api-api" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.652307 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.655130 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.655435 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.655609 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.660632 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.759675 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763813e1-3dd6-4a8b-9aa4-460abe73e264-logs\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.759816 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.759871 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-internal-tls-certs\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.760186 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-public-tls-certs\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.760335 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86nsx\" (UniqueName: \"kubernetes.io/projected/763813e1-3dd6-4a8b-9aa4-460abe73e264-kube-api-access-86nsx\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.760364 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-config-data\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.862540 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-public-tls-certs\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.862633 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86nsx\" (UniqueName: \"kubernetes.io/projected/763813e1-3dd6-4a8b-9aa4-460abe73e264-kube-api-access-86nsx\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.862669 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-config-data\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 
10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.862772 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763813e1-3dd6-4a8b-9aa4-460abe73e264-logs\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.862868 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.862899 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-internal-tls-certs\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.865329 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763813e1-3dd6-4a8b-9aa4-460abe73e264-logs\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.868489 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-public-tls-certs\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.868571 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.868630 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-config-data\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.868730 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.888303 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86nsx\" (UniqueName: \"kubernetes.io/projected/763813e1-3dd6-4a8b-9aa4-460abe73e264-kube-api-access-86nsx\") pod \"nova-api-0\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " pod="openstack/nova-api-0" Mar 20 10:59:45 crc kubenswrapper[4748]: I0320 10:59:45.997686 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:46 crc kubenswrapper[4748]: I0320 10:59:46.127165 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:46 crc kubenswrapper[4748]: I0320 10:59:46.155756 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:46 crc kubenswrapper[4748]: I0320 10:59:46.538855 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:46 crc kubenswrapper[4748]: I0320 10:59:46.617321 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729","Type":"ContainerStarted","Data":"6f5ccf3fc6f379e479084fc05831e25807d7b24baa886b124561078d730d0b76"} Mar 20 10:59:46 crc kubenswrapper[4748]: I0320 10:59:46.619345 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763813e1-3dd6-4a8b-9aa4-460abe73e264","Type":"ContainerStarted","Data":"b637a3506a91b510b5222f108b28c3c0093a4deba7404b662b623bf408e3a7ae"} Mar 20 10:59:46 crc kubenswrapper[4748]: I0320 10:59:46.639691 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 10:59:46 crc kubenswrapper[4748]: I0320 10:59:46.984928 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hkmq9"] Mar 20 10:59:46 crc kubenswrapper[4748]: I0320 10:59:46.986757 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:46 crc kubenswrapper[4748]: I0320 10:59:46.990166 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 10:59:46 crc kubenswrapper[4748]: I0320 10:59:46.995849 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.010599 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hkmq9"] Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.098148 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-config-data\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.098209 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.098257 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rz28\" (UniqueName: \"kubernetes.io/projected/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-kube-api-access-9rz28\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.098316 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-scripts\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.201047 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-scripts\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.201326 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-config-data\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.201413 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.201531 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rz28\" (UniqueName: \"kubernetes.io/projected/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-kube-api-access-9rz28\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.207067 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-scripts\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.207576 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.216904 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-config-data\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.224681 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rz28\" (UniqueName: \"kubernetes.io/projected/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-kube-api-access-9rz28\") pod \"nova-cell1-cell-mapping-hkmq9\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.330737 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.533254 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0a406ab-9700-4aff-ac24-931c6862568d" path="/var/lib/kubelet/pods/f0a406ab-9700-4aff-ac24-931c6862568d/volumes" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.639343 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729","Type":"ContainerStarted","Data":"c353ed06e6f8f53e1243b5b8f308e8427c57a3e3422e265e2be563a2aa08eb88"} Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.642461 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763813e1-3dd6-4a8b-9aa4-460abe73e264","Type":"ContainerStarted","Data":"64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0"} Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.642545 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763813e1-3dd6-4a8b-9aa4-460abe73e264","Type":"ContainerStarted","Data":"f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c"} Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.684350 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.684320009 podStartE2EDuration="2.684320009s" podCreationTimestamp="2026-03-20 10:59:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:47.67432438 +0000 UTC m=+1422.815870214" watchObservedRunningTime="2026-03-20 10:59:47.684320009 +0000 UTC m=+1422.825865823" Mar 20 10:59:47 crc kubenswrapper[4748]: I0320 10:59:47.856589 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hkmq9"] Mar 20 10:59:47 crc kubenswrapper[4748]: W0320 10:59:47.858604 4748 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod864d7cb6_6b7f_4b08_9555_9c89fb2f0e04.slice/crio-f74d0fbafb4ca17c97db11d45ca477d70358c69c5388c0f4eaa13617c6aad07f WatchSource:0}: Error finding container f74d0fbafb4ca17c97db11d45ca477d70358c69c5388c0f4eaa13617c6aad07f: Status 404 returned error can't find the container with id f74d0fbafb4ca17c97db11d45ca477d70358c69c5388c0f4eaa13617c6aad07f Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.225520 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.295769 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9c85k"] Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.296147 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" podUID="46817d78-848a-4ab6-8de1-2e2b03381000" containerName="dnsmasq-dns" containerID="cri-o://7c379ff449728ba81f9de6130ce47080f5d3d4a8fac849e17e9e6c3e04c6b10e" gracePeriod=10 Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.420411 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" podUID="46817d78-848a-4ab6-8de1-2e2b03381000" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: connect: connection refused" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.657506 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hkmq9" event={"ID":"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04","Type":"ContainerStarted","Data":"2273cd0372819a347527cbc3d6009a6f2bb4e2e90d83429149b6691f73d025d6"} Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.657879 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hkmq9" 
event={"ID":"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04","Type":"ContainerStarted","Data":"f74d0fbafb4ca17c97db11d45ca477d70358c69c5388c0f4eaa13617c6aad07f"} Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.659163 4748 generic.go:334] "Generic (PLEG): container finished" podID="46817d78-848a-4ab6-8de1-2e2b03381000" containerID="7c379ff449728ba81f9de6130ce47080f5d3d4a8fac849e17e9e6c3e04c6b10e" exitCode=0 Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.659691 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" event={"ID":"46817d78-848a-4ab6-8de1-2e2b03381000","Type":"ContainerDied","Data":"7c379ff449728ba81f9de6130ce47080f5d3d4a8fac849e17e9e6c3e04c6b10e"} Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.684674 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hkmq9" podStartSLOduration=2.684654037 podStartE2EDuration="2.684654037s" podCreationTimestamp="2026-03-20 10:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:48.675037046 +0000 UTC m=+1423.816582870" watchObservedRunningTime="2026-03-20 10:59:48.684654037 +0000 UTC m=+1423.826199851" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.712346 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.735392 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-nb\") pod \"46817d78-848a-4ab6-8de1-2e2b03381000\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.735459 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-config\") pod \"46817d78-848a-4ab6-8de1-2e2b03381000\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.735546 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-swift-storage-0\") pod \"46817d78-848a-4ab6-8de1-2e2b03381000\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.735568 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-svc\") pod \"46817d78-848a-4ab6-8de1-2e2b03381000\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.735673 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq7xk\" (UniqueName: \"kubernetes.io/projected/46817d78-848a-4ab6-8de1-2e2b03381000-kube-api-access-gq7xk\") pod \"46817d78-848a-4ab6-8de1-2e2b03381000\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.735715 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-sb\") pod \"46817d78-848a-4ab6-8de1-2e2b03381000\" (UID: \"46817d78-848a-4ab6-8de1-2e2b03381000\") " Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.763040 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46817d78-848a-4ab6-8de1-2e2b03381000-kube-api-access-gq7xk" (OuterVolumeSpecName: "kube-api-access-gq7xk") pod "46817d78-848a-4ab6-8de1-2e2b03381000" (UID: "46817d78-848a-4ab6-8de1-2e2b03381000"). InnerVolumeSpecName "kube-api-access-gq7xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.802191 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46817d78-848a-4ab6-8de1-2e2b03381000" (UID: "46817d78-848a-4ab6-8de1-2e2b03381000"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.804412 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46817d78-848a-4ab6-8de1-2e2b03381000" (UID: "46817d78-848a-4ab6-8de1-2e2b03381000"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.819117 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "46817d78-848a-4ab6-8de1-2e2b03381000" (UID: "46817d78-848a-4ab6-8de1-2e2b03381000"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.821310 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-config" (OuterVolumeSpecName: "config") pod "46817d78-848a-4ab6-8de1-2e2b03381000" (UID: "46817d78-848a-4ab6-8de1-2e2b03381000"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.833579 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46817d78-848a-4ab6-8de1-2e2b03381000" (UID: "46817d78-848a-4ab6-8de1-2e2b03381000"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.841227 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq7xk\" (UniqueName: \"kubernetes.io/projected/46817d78-848a-4ab6-8de1-2e2b03381000-kube-api-access-gq7xk\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.841252 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.841261 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.841270 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-config\") on node \"crc\" DevicePath \"\"" Mar 20 
10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.841280 4748 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:48 crc kubenswrapper[4748]: I0320 10:59:48.841288 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46817d78-848a-4ab6-8de1-2e2b03381000-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.672353 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729","Type":"ContainerStarted","Data":"4e908ab37e2c79196cb9b30af43500cb2e4a3912257202aab505a4d6f6d22626"} Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.673230 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.673215 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="ceilometer-central-agent" containerID="cri-o://de2cf53897f1fae8cfe402a1267d257326195e4f7f9853a322c0a26d10c18478" gracePeriod=30 Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.673249 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="sg-core" containerID="cri-o://c353ed06e6f8f53e1243b5b8f308e8427c57a3e3422e265e2be563a2aa08eb88" gracePeriod=30 Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.673252 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="proxy-httpd" 
containerID="cri-o://4e908ab37e2c79196cb9b30af43500cb2e4a3912257202aab505a4d6f6d22626" gracePeriod=30 Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.673371 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="ceilometer-notification-agent" containerID="cri-o://6f5ccf3fc6f379e479084fc05831e25807d7b24baa886b124561078d730d0b76" gracePeriod=30 Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.685182 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.685248 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9c85k" event={"ID":"46817d78-848a-4ab6-8de1-2e2b03381000","Type":"ContainerDied","Data":"97b691cd7d02b4968233ba62e4838ab17b2af99371ea1276f7c3526d7dcd93df"} Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.687072 4748 scope.go:117] "RemoveContainer" containerID="7c379ff449728ba81f9de6130ce47080f5d3d4a8fac849e17e9e6c3e04c6b10e" Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.707183 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.356297652 podStartE2EDuration="7.707153237s" podCreationTimestamp="2026-03-20 10:59:42 +0000 UTC" firstStartedPulling="2026-03-20 10:59:43.494250553 +0000 UTC m=+1418.635796367" lastFinishedPulling="2026-03-20 10:59:48.845106138 +0000 UTC m=+1423.986651952" observedRunningTime="2026-03-20 10:59:49.701016354 +0000 UTC m=+1424.842562218" watchObservedRunningTime="2026-03-20 10:59:49.707153237 +0000 UTC m=+1424.848699051" Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.719355 4748 scope.go:117] "RemoveContainer" containerID="75303485c9a0643eebc9b8259b42027729c61be3251c5259866868d260dda6db" Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.736915 
4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9c85k"] Mar 20 10:59:49 crc kubenswrapper[4748]: I0320 10:59:49.785304 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9c85k"] Mar 20 10:59:49 crc kubenswrapper[4748]: E0320 10:59:49.933801 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc91f_da3f_4748_85d6_dbf4ef8cb729.slice/crio-c353ed06e6f8f53e1243b5b8f308e8427c57a3e3422e265e2be563a2aa08eb88.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc91f_da3f_4748_85d6_dbf4ef8cb729.slice/crio-4e908ab37e2c79196cb9b30af43500cb2e4a3912257202aab505a4d6f6d22626.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc91f_da3f_4748_85d6_dbf4ef8cb729.slice/crio-conmon-c353ed06e6f8f53e1243b5b8f308e8427c57a3e3422e265e2be563a2aa08eb88.scope\": RecentStats: unable to find data in memory cache]" Mar 20 10:59:50 crc kubenswrapper[4748]: I0320 10:59:50.700863 4748 generic.go:334] "Generic (PLEG): container finished" podID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerID="4e908ab37e2c79196cb9b30af43500cb2e4a3912257202aab505a4d6f6d22626" exitCode=0 Mar 20 10:59:50 crc kubenswrapper[4748]: I0320 10:59:50.701173 4748 generic.go:334] "Generic (PLEG): container finished" podID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerID="c353ed06e6f8f53e1243b5b8f308e8427c57a3e3422e265e2be563a2aa08eb88" exitCode=2 Mar 20 10:59:50 crc kubenswrapper[4748]: I0320 10:59:50.701186 4748 generic.go:334] "Generic (PLEG): container finished" podID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerID="6f5ccf3fc6f379e479084fc05831e25807d7b24baa886b124561078d730d0b76" exitCode=0 Mar 20 10:59:50 crc kubenswrapper[4748]: I0320 10:59:50.700952 
4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729","Type":"ContainerDied","Data":"4e908ab37e2c79196cb9b30af43500cb2e4a3912257202aab505a4d6f6d22626"} Mar 20 10:59:50 crc kubenswrapper[4748]: I0320 10:59:50.701224 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729","Type":"ContainerDied","Data":"c353ed06e6f8f53e1243b5b8f308e8427c57a3e3422e265e2be563a2aa08eb88"} Mar 20 10:59:50 crc kubenswrapper[4748]: I0320 10:59:50.701247 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729","Type":"ContainerDied","Data":"6f5ccf3fc6f379e479084fc05831e25807d7b24baa886b124561078d730d0b76"} Mar 20 10:59:51 crc kubenswrapper[4748]: I0320 10:59:51.528947 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46817d78-848a-4ab6-8de1-2e2b03381000" path="/var/lib/kubelet/pods/46817d78-848a-4ab6-8de1-2e2b03381000/volumes" Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.729443 4748 generic.go:334] "Generic (PLEG): container finished" podID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerID="de2cf53897f1fae8cfe402a1267d257326195e4f7f9853a322c0a26d10c18478" exitCode=0 Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.729998 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729","Type":"ContainerDied","Data":"de2cf53897f1fae8cfe402a1267d257326195e4f7f9853a322c0a26d10c18478"} Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.731010 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729","Type":"ContainerDied","Data":"3132411f1033e083f8c0cca1a4a815dd3508fc3f51ebe1cc80bba354d7999b54"} Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.731097 4748 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3132411f1033e083f8c0cca1a4a815dd3508fc3f51ebe1cc80bba354d7999b54" Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.796776 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.942632 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-log-httpd\") pod \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.942711 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-config-data\") pod \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.942768 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-ceilometer-tls-certs\") pod \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.942811 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-sg-core-conf-yaml\") pod \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.942876 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psk4v\" (UniqueName: 
\"kubernetes.io/projected/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-kube-api-access-psk4v\") pod \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.942983 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-run-httpd\") pod \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.943017 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-combined-ca-bundle\") pod \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.943035 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-scripts\") pod \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\" (UID: \"6e1cc91f-da3f-4748-85d6-dbf4ef8cb729\") " Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.943216 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" (UID: "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.943440 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" (UID: "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.943496 4748 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.948375 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-kube-api-access-psk4v" (OuterVolumeSpecName: "kube-api-access-psk4v") pod "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" (UID: "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729"). InnerVolumeSpecName "kube-api-access-psk4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.956046 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-scripts" (OuterVolumeSpecName: "scripts") pod "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" (UID: "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.968349 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" (UID: "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:52 crc kubenswrapper[4748]: I0320 10:59:52.997732 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" (UID: "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.024244 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" (UID: "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.045900 4748 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.045936 4748 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.045952 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psk4v\" (UniqueName: \"kubernetes.io/projected/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-kube-api-access-psk4v\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.045965 4748 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.045974 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.045982 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.072721 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-config-data" (OuterVolumeSpecName: "config-data") pod "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" (UID: "6e1cc91f-da3f-4748-85d6-dbf4ef8cb729"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.147710 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.743168 4748 generic.go:334] "Generic (PLEG): container finished" podID="864d7cb6-6b7f-4b08-9555-9c89fb2f0e04" containerID="2273cd0372819a347527cbc3d6009a6f2bb4e2e90d83429149b6691f73d025d6" exitCode=0 Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.743257 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hkmq9" event={"ID":"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04","Type":"ContainerDied","Data":"2273cd0372819a347527cbc3d6009a6f2bb4e2e90d83429149b6691f73d025d6"} Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.744270 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.826530 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.838743 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.849894 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:53 crc kubenswrapper[4748]: E0320 10:59:53.850362 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="sg-core" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850385 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="sg-core" Mar 20 10:59:53 crc kubenswrapper[4748]: E0320 10:59:53.850410 4748 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="46817d78-848a-4ab6-8de1-2e2b03381000" containerName="init" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850418 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="46817d78-848a-4ab6-8de1-2e2b03381000" containerName="init" Mar 20 10:59:53 crc kubenswrapper[4748]: E0320 10:59:53.850436 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="ceilometer-notification-agent" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850446 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="ceilometer-notification-agent" Mar 20 10:59:53 crc kubenswrapper[4748]: E0320 10:59:53.850464 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="proxy-httpd" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850471 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="proxy-httpd" Mar 20 10:59:53 crc kubenswrapper[4748]: E0320 10:59:53.850485 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="ceilometer-central-agent" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850492 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="ceilometer-central-agent" Mar 20 10:59:53 crc kubenswrapper[4748]: E0320 10:59:53.850512 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46817d78-848a-4ab6-8de1-2e2b03381000" containerName="dnsmasq-dns" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850520 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="46817d78-848a-4ab6-8de1-2e2b03381000" containerName="dnsmasq-dns" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850758 4748 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="proxy-httpd" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850779 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="46817d78-848a-4ab6-8de1-2e2b03381000" containerName="dnsmasq-dns" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850795 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="ceilometer-central-agent" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850821 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="ceilometer-notification-agent" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.850855 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" containerName="sg-core" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.852716 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.858807 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.859113 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.859269 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.863556 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.963568 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-run-httpd\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.963667 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.963711 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.963809 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.963858 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckfrr\" (UniqueName: \"kubernetes.io/projected/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-kube-api-access-ckfrr\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.963894 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-scripts\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.964071 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-config-data\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:53 crc kubenswrapper[4748]: I0320 10:59:53.964105 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-log-httpd\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.065658 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-config-data\") pod \"ceilometer-0\" (UID: 
\"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.065726 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-log-httpd\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.065765 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-run-httpd\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.065848 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.065913 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.066541 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-run-httpd\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.066569 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.066662 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckfrr\" (UniqueName: \"kubernetes.io/projected/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-kube-api-access-ckfrr\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.066732 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-scripts\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.067661 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-log-httpd\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.071293 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-config-data\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.071353 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 
10:59:54.072528 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.080372 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-scripts\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.082797 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckfrr\" (UniqueName: \"kubernetes.io/projected/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-kube-api-access-ckfrr\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.088093 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06f5e1bb-615b-4f6e-9c62-93a82f0984c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06f5e1bb-615b-4f6e-9c62-93a82f0984c8\") " pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.193486 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.631343 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 10:59:54 crc kubenswrapper[4748]: I0320 10:59:54.754692 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f5e1bb-615b-4f6e-9c62-93a82f0984c8","Type":"ContainerStarted","Data":"bcbe98f9a7144c833aacc637320841ef6694a55b4d45efdfebea3642a3068dc2"} Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.073579 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.215055 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-config-data\") pod \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.215234 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rz28\" (UniqueName: \"kubernetes.io/projected/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-kube-api-access-9rz28\") pod \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.215279 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-combined-ca-bundle\") pod \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.215341 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-scripts\") pod \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\" (UID: \"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04\") " Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.220827 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-kube-api-access-9rz28" (OuterVolumeSpecName: "kube-api-access-9rz28") pod "864d7cb6-6b7f-4b08-9555-9c89fb2f0e04" (UID: "864d7cb6-6b7f-4b08-9555-9c89fb2f0e04"). InnerVolumeSpecName "kube-api-access-9rz28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.228087 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-scripts" (OuterVolumeSpecName: "scripts") pod "864d7cb6-6b7f-4b08-9555-9c89fb2f0e04" (UID: "864d7cb6-6b7f-4b08-9555-9c89fb2f0e04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.244631 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "864d7cb6-6b7f-4b08-9555-9c89fb2f0e04" (UID: "864d7cb6-6b7f-4b08-9555-9c89fb2f0e04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.251152 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-config-data" (OuterVolumeSpecName: "config-data") pod "864d7cb6-6b7f-4b08-9555-9c89fb2f0e04" (UID: "864d7cb6-6b7f-4b08-9555-9c89fb2f0e04"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.317578 4748 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.317611 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.317621 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rz28\" (UniqueName: \"kubernetes.io/projected/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-kube-api-access-9rz28\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.317633 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.526895 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e1cc91f-da3f-4748-85d6-dbf4ef8cb729" path="/var/lib/kubelet/pods/6e1cc91f-da3f-4748-85d6-dbf4ef8cb729/volumes" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.765536 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hkmq9" event={"ID":"864d7cb6-6b7f-4b08-9555-9c89fb2f0e04","Type":"ContainerDied","Data":"f74d0fbafb4ca17c97db11d45ca477d70358c69c5388c0f4eaa13617c6aad07f"} Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.765569 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hkmq9" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.765625 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f74d0fbafb4ca17c97db11d45ca477d70358c69c5388c0f4eaa13617c6aad07f" Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.768885 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f5e1bb-615b-4f6e-9c62-93a82f0984c8","Type":"ContainerStarted","Data":"149c8b3d66c3ade2ccb44f3407a57636088141cd7769dcdb09222fc2bb13fbf4"} Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.994227 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.994624 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="763813e1-3dd6-4a8b-9aa4-460abe73e264" containerName="nova-api-log" containerID="cri-o://f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c" gracePeriod=30 Mar 20 10:59:55 crc kubenswrapper[4748]: I0320 10:59:55.995202 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="763813e1-3dd6-4a8b-9aa4-460abe73e264" containerName="nova-api-api" containerID="cri-o://64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0" gracePeriod=30 Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.030291 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.030736 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="794b1ed4-1370-4c35-b4b2-7ad2da01f690" containerName="nova-scheduler-scheduler" containerID="cri-o://c7035bb73c98a18f77aaa64deb90fcdc3b05847f09179b722a88febf5926ad0e" gracePeriod=30 Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.040454 4748 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.040713 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d51df005-e870-4b00-9eac-72bb099077dd" containerName="nova-metadata-log" containerID="cri-o://8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4" gracePeriod=30 Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.040789 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d51df005-e870-4b00-9eac-72bb099077dd" containerName="nova-metadata-metadata" containerID="cri-o://9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d" gracePeriod=30 Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.542220 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.640597 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86nsx\" (UniqueName: \"kubernetes.io/projected/763813e1-3dd6-4a8b-9aa4-460abe73e264-kube-api-access-86nsx\") pod \"763813e1-3dd6-4a8b-9aa4-460abe73e264\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.640644 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763813e1-3dd6-4a8b-9aa4-460abe73e264-logs\") pod \"763813e1-3dd6-4a8b-9aa4-460abe73e264\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.640669 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-combined-ca-bundle\") pod \"763813e1-3dd6-4a8b-9aa4-460abe73e264\" (UID: 
\"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.640700 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-internal-tls-certs\") pod \"763813e1-3dd6-4a8b-9aa4-460abe73e264\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.640793 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-config-data\") pod \"763813e1-3dd6-4a8b-9aa4-460abe73e264\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.640896 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-public-tls-certs\") pod \"763813e1-3dd6-4a8b-9aa4-460abe73e264\" (UID: \"763813e1-3dd6-4a8b-9aa4-460abe73e264\") " Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.641192 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763813e1-3dd6-4a8b-9aa4-460abe73e264-logs" (OuterVolumeSpecName: "logs") pod "763813e1-3dd6-4a8b-9aa4-460abe73e264" (UID: "763813e1-3dd6-4a8b-9aa4-460abe73e264"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.641418 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763813e1-3dd6-4a8b-9aa4-460abe73e264-logs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.647403 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763813e1-3dd6-4a8b-9aa4-460abe73e264-kube-api-access-86nsx" (OuterVolumeSpecName: "kube-api-access-86nsx") pod "763813e1-3dd6-4a8b-9aa4-460abe73e264" (UID: "763813e1-3dd6-4a8b-9aa4-460abe73e264"). InnerVolumeSpecName "kube-api-access-86nsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.673148 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "763813e1-3dd6-4a8b-9aa4-460abe73e264" (UID: "763813e1-3dd6-4a8b-9aa4-460abe73e264"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.684598 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-config-data" (OuterVolumeSpecName: "config-data") pod "763813e1-3dd6-4a8b-9aa4-460abe73e264" (UID: "763813e1-3dd6-4a8b-9aa4-460abe73e264"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.695559 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "763813e1-3dd6-4a8b-9aa4-460abe73e264" (UID: "763813e1-3dd6-4a8b-9aa4-460abe73e264"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.710990 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "763813e1-3dd6-4a8b-9aa4-460abe73e264" (UID: "763813e1-3dd6-4a8b-9aa4-460abe73e264"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.743461 4748 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.743497 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86nsx\" (UniqueName: \"kubernetes.io/projected/763813e1-3dd6-4a8b-9aa4-460abe73e264-kube-api-access-86nsx\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.743508 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.743519 4748 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.743528 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763813e1-3dd6-4a8b-9aa4-460abe73e264-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.787173 4748 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"06f5e1bb-615b-4f6e-9c62-93a82f0984c8","Type":"ContainerStarted","Data":"f1cd34953a1ad397add45dc21927218f016c3ecb68bdf2ce75d753d08ecd4a63"} Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.789681 4748 generic.go:334] "Generic (PLEG): container finished" podID="763813e1-3dd6-4a8b-9aa4-460abe73e264" containerID="64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0" exitCode=0 Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.789726 4748 generic.go:334] "Generic (PLEG): container finished" podID="763813e1-3dd6-4a8b-9aa4-460abe73e264" containerID="f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c" exitCode=143 Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.789753 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.789776 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763813e1-3dd6-4a8b-9aa4-460abe73e264","Type":"ContainerDied","Data":"64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0"} Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.789806 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763813e1-3dd6-4a8b-9aa4-460abe73e264","Type":"ContainerDied","Data":"f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c"} Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.789817 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"763813e1-3dd6-4a8b-9aa4-460abe73e264","Type":"ContainerDied","Data":"b637a3506a91b510b5222f108b28c3c0093a4deba7404b662b623bf408e3a7ae"} Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.789843 4748 scope.go:117] "RemoveContainer" containerID="64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 
10:59:56.796225 4748 generic.go:334] "Generic (PLEG): container finished" podID="d51df005-e870-4b00-9eac-72bb099077dd" containerID="8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4" exitCode=143 Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.796283 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d51df005-e870-4b00-9eac-72bb099077dd","Type":"ContainerDied","Data":"8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4"} Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.829433 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.833191 4748 scope.go:117] "RemoveContainer" containerID="f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.865262 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.867456 4748 scope.go:117] "RemoveContainer" containerID="64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0" Mar 20 10:59:56 crc kubenswrapper[4748]: E0320 10:59:56.871441 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0\": container with ID starting with 64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0 not found: ID does not exist" containerID="64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.871484 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0"} err="failed to get container status \"64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0\": rpc error: code = NotFound desc = 
could not find container \"64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0\": container with ID starting with 64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0 not found: ID does not exist" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.871512 4748 scope.go:117] "RemoveContainer" containerID="f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.876947 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:56 crc kubenswrapper[4748]: E0320 10:59:56.877392 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763813e1-3dd6-4a8b-9aa4-460abe73e264" containerName="nova-api-api" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.877415 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="763813e1-3dd6-4a8b-9aa4-460abe73e264" containerName="nova-api-api" Mar 20 10:59:56 crc kubenswrapper[4748]: E0320 10:59:56.877430 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864d7cb6-6b7f-4b08-9555-9c89fb2f0e04" containerName="nova-manage" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.877439 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="864d7cb6-6b7f-4b08-9555-9c89fb2f0e04" containerName="nova-manage" Mar 20 10:59:56 crc kubenswrapper[4748]: E0320 10:59:56.877465 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763813e1-3dd6-4a8b-9aa4-460abe73e264" containerName="nova-api-log" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.877474 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="763813e1-3dd6-4a8b-9aa4-460abe73e264" containerName="nova-api-log" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.877719 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="763813e1-3dd6-4a8b-9aa4-460abe73e264" containerName="nova-api-api" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.877744 4748 
memory_manager.go:354] "RemoveStaleState removing state" podUID="763813e1-3dd6-4a8b-9aa4-460abe73e264" containerName="nova-api-log" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.877761 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="864d7cb6-6b7f-4b08-9555-9c89fb2f0e04" containerName="nova-manage" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.878993 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.881099 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.881511 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.881664 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 10:59:56 crc kubenswrapper[4748]: E0320 10:59:56.883807 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c\": container with ID starting with f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c not found: ID does not exist" containerID="f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.883873 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c"} err="failed to get container status \"f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c\": rpc error: code = NotFound desc = could not find container \"f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c\": container with ID starting with 
f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c not found: ID does not exist" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.883908 4748 scope.go:117] "RemoveContainer" containerID="64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.887661 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0"} err="failed to get container status \"64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0\": rpc error: code = NotFound desc = could not find container \"64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0\": container with ID starting with 64f1b3c23f8a06b1ecbb7dc583024c16dfd6c3a26a4a2d5a91c0ec40a4c77ea0 not found: ID does not exist" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.887697 4748 scope.go:117] "RemoveContainer" containerID="f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c" Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.888460 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:56 crc kubenswrapper[4748]: I0320 10:59:56.891244 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c"} err="failed to get container status \"f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c\": rpc error: code = NotFound desc = could not find container \"f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c\": container with ID starting with f5eedbe5a3ee5247af59209199132fc918fab77c45c858a4ca2971ee3f78951c not found: ID does not exist" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.050213 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-689wb\" (UniqueName: 
\"kubernetes.io/projected/6acc81f7-4f7d-4828-a328-1e2a4426bd57-kube-api-access-689wb\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.050280 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.050316 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acc81f7-4f7d-4828-a328-1e2a4426bd57-logs\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.050352 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.050430 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-public-tls-certs\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.050479 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-config-data\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " 
pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.152182 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-689wb\" (UniqueName: \"kubernetes.io/projected/6acc81f7-4f7d-4828-a328-1e2a4426bd57-kube-api-access-689wb\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.152537 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.152565 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acc81f7-4f7d-4828-a328-1e2a4426bd57-logs\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.152593 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.152624 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-public-tls-certs\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.152658 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-config-data\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.153222 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acc81f7-4f7d-4828-a328-1e2a4426bd57-logs\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.160981 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-public-tls-certs\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.162505 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.163044 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-config-data\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.170386 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acc81f7-4f7d-4828-a328-1e2a4426bd57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.171692 4748 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-689wb\" (UniqueName: \"kubernetes.io/projected/6acc81f7-4f7d-4828-a328-1e2a4426bd57-kube-api-access-689wb\") pod \"nova-api-0\" (UID: \"6acc81f7-4f7d-4828-a328-1e2a4426bd57\") " pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.201169 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.531800 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="763813e1-3dd6-4a8b-9aa4-460abe73e264" path="/var/lib/kubelet/pods/763813e1-3dd6-4a8b-9aa4-460abe73e264/volumes" Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.664385 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.808511 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f5e1bb-615b-4f6e-9c62-93a82f0984c8","Type":"ContainerStarted","Data":"18a5b9215af574d00bcdc274acd92b23b2b782d4dba5c1dece47502595db2935"} Mar 20 10:59:57 crc kubenswrapper[4748]: I0320 10:59:57.819186 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6acc81f7-4f7d-4828-a328-1e2a4426bd57","Type":"ContainerStarted","Data":"3908ed8727d0e6e30ab704f7143809cea4393e9234125e7ed2202306345d050b"} Mar 20 10:59:58 crc kubenswrapper[4748]: I0320 10:59:58.829464 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6acc81f7-4f7d-4828-a328-1e2a4426bd57","Type":"ContainerStarted","Data":"a406d0d35294ff1bdccc61a9b26cbdea75f3bd7e9f0dd896637165d181fccd22"} Mar 20 10:59:58 crc kubenswrapper[4748]: I0320 10:59:58.830652 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6acc81f7-4f7d-4828-a328-1e2a4426bd57","Type":"ContainerStarted","Data":"b05b05162d511f7adc4049a2a4eb37714886a6aeae3896fab5cf5cca289127e6"} Mar 
20 10:59:58 crc kubenswrapper[4748]: I0320 10:59:58.864393 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.864372666 podStartE2EDuration="2.864372666s" podCreationTimestamp="2026-03-20 10:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:58.860962921 +0000 UTC m=+1434.002508805" watchObservedRunningTime="2026-03-20 10:59:58.864372666 +0000 UTC m=+1434.005918480" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.709793 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.855689 4748 generic.go:334] "Generic (PLEG): container finished" podID="d51df005-e870-4b00-9eac-72bb099077dd" containerID="9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d" exitCode=0 Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.855758 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d51df005-e870-4b00-9eac-72bb099077dd","Type":"ContainerDied","Data":"9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d"} Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.855796 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d51df005-e870-4b00-9eac-72bb099077dd","Type":"ContainerDied","Data":"401b2aa649fbc7dd43a173a28b7f0cbb58b992cc040162efa4fbc478c77e2c88"} Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.855817 4748 scope.go:117] "RemoveContainer" containerID="9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.855994 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.864086 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f5e1bb-615b-4f6e-9c62-93a82f0984c8","Type":"ContainerStarted","Data":"21701f8ee57d144ba26dd04805fe0977708b6311efbfdffba2c24403aa07bf99"} Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.864231 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.866655 4748 generic.go:334] "Generic (PLEG): container finished" podID="794b1ed4-1370-4c35-b4b2-7ad2da01f690" containerID="c7035bb73c98a18f77aaa64deb90fcdc3b05847f09179b722a88febf5926ad0e" exitCode=0 Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.867343 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"794b1ed4-1370-4c35-b4b2-7ad2da01f690","Type":"ContainerDied","Data":"c7035bb73c98a18f77aaa64deb90fcdc3b05847f09179b722a88febf5926ad0e"} Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.903342 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.497335065 podStartE2EDuration="6.903316579s" podCreationTimestamp="2026-03-20 10:59:53 +0000 UTC" firstStartedPulling="2026-03-20 10:59:54.637718955 +0000 UTC m=+1429.779264769" lastFinishedPulling="2026-03-20 10:59:59.043700419 +0000 UTC m=+1434.185246283" observedRunningTime="2026-03-20 10:59:59.899165935 +0000 UTC m=+1435.040711769" watchObservedRunningTime="2026-03-20 10:59:59.903316579 +0000 UTC m=+1435.044862393" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.905011 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-combined-ca-bundle\") pod \"d51df005-e870-4b00-9eac-72bb099077dd\" (UID: 
\"d51df005-e870-4b00-9eac-72bb099077dd\") " Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.905107 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74dbn\" (UniqueName: \"kubernetes.io/projected/d51df005-e870-4b00-9eac-72bb099077dd-kube-api-access-74dbn\") pod \"d51df005-e870-4b00-9eac-72bb099077dd\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.905186 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-nova-metadata-tls-certs\") pod \"d51df005-e870-4b00-9eac-72bb099077dd\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.905299 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d51df005-e870-4b00-9eac-72bb099077dd-logs\") pod \"d51df005-e870-4b00-9eac-72bb099077dd\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.905383 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-config-data\") pod \"d51df005-e870-4b00-9eac-72bb099077dd\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.907772 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d51df005-e870-4b00-9eac-72bb099077dd-logs" (OuterVolumeSpecName: "logs") pod "d51df005-e870-4b00-9eac-72bb099077dd" (UID: "d51df005-e870-4b00-9eac-72bb099077dd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.913197 4748 scope.go:117] "RemoveContainer" containerID="8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.924416 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d51df005-e870-4b00-9eac-72bb099077dd-kube-api-access-74dbn" (OuterVolumeSpecName: "kube-api-access-74dbn") pod "d51df005-e870-4b00-9eac-72bb099077dd" (UID: "d51df005-e870-4b00-9eac-72bb099077dd"). InnerVolumeSpecName "kube-api-access-74dbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.939590 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.947804 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d51df005-e870-4b00-9eac-72bb099077dd" (UID: "d51df005-e870-4b00-9eac-72bb099077dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.951457 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-config-data" (OuterVolumeSpecName: "config-data") pod "d51df005-e870-4b00-9eac-72bb099077dd" (UID: "d51df005-e870-4b00-9eac-72bb099077dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.964001 4748 scope.go:117] "RemoveContainer" containerID="9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d" Mar 20 10:59:59 crc kubenswrapper[4748]: E0320 10:59:59.964598 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d\": container with ID starting with 9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d not found: ID does not exist" containerID="9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.964642 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d"} err="failed to get container status \"9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d\": rpc error: code = NotFound desc = could not find container \"9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d\": container with ID starting with 9cee787e55ad75700ab13f12ae054c2f0b9292cb89cb56e7f80fde608a3c3d9d not found: ID does not exist" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.964671 4748 scope.go:117] "RemoveContainer" containerID="8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4" Mar 20 10:59:59 crc kubenswrapper[4748]: E0320 10:59:59.965025 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4\": container with ID starting with 8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4 not found: ID does not exist" containerID="8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4" Mar 20 10:59:59 crc kubenswrapper[4748]: I0320 10:59:59.965052 
4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4"} err="failed to get container status \"8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4\": rpc error: code = NotFound desc = could not find container \"8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4\": container with ID starting with 8aa854eb68b19e0600024ec449e7ab9c78521ca3a6ab8b2d88b95c6e254c9de4 not found: ID does not exist" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.007407 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d51df005-e870-4b00-9eac-72bb099077dd" (UID: "d51df005-e870-4b00-9eac-72bb099077dd"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.008269 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-combined-ca-bundle\") pod \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.008571 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbmch\" (UniqueName: \"kubernetes.io/projected/794b1ed4-1370-4c35-b4b2-7ad2da01f690-kube-api-access-mbmch\") pod \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.008716 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-config-data\") pod 
\"794b1ed4-1370-4c35-b4b2-7ad2da01f690\" (UID: \"794b1ed4-1370-4c35-b4b2-7ad2da01f690\") " Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.008772 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-nova-metadata-tls-certs\") pod \"d51df005-e870-4b00-9eac-72bb099077dd\" (UID: \"d51df005-e870-4b00-9eac-72bb099077dd\") " Mar 20 11:00:00 crc kubenswrapper[4748]: W0320 11:00:00.009226 4748 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d51df005-e870-4b00-9eac-72bb099077dd/volumes/kubernetes.io~secret/nova-metadata-tls-certs Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.009247 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d51df005-e870-4b00-9eac-72bb099077dd" (UID: "d51df005-e870-4b00-9eac-72bb099077dd"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.009714 4748 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d51df005-e870-4b00-9eac-72bb099077dd-logs\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.009735 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.009745 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.009756 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74dbn\" (UniqueName: \"kubernetes.io/projected/d51df005-e870-4b00-9eac-72bb099077dd-kube-api-access-74dbn\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.009766 4748 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51df005-e870-4b00-9eac-72bb099077dd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.012519 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794b1ed4-1370-4c35-b4b2-7ad2da01f690-kube-api-access-mbmch" (OuterVolumeSpecName: "kube-api-access-mbmch") pod "794b1ed4-1370-4c35-b4b2-7ad2da01f690" (UID: "794b1ed4-1370-4c35-b4b2-7ad2da01f690"). InnerVolumeSpecName "kube-api-access-mbmch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.049691 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "794b1ed4-1370-4c35-b4b2-7ad2da01f690" (UID: "794b1ed4-1370-4c35-b4b2-7ad2da01f690"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.059069 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-config-data" (OuterVolumeSpecName: "config-data") pod "794b1ed4-1370-4c35-b4b2-7ad2da01f690" (UID: "794b1ed4-1370-4c35-b4b2-7ad2da01f690"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.123372 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbmch\" (UniqueName: \"kubernetes.io/projected/794b1ed4-1370-4c35-b4b2-7ad2da01f690-kube-api-access-mbmch\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.123413 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.123425 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794b1ed4-1370-4c35-b4b2-7ad2da01f690-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.164156 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566740-jzzl4"] Mar 20 11:00:00 crc kubenswrapper[4748]: E0320 11:00:00.164639 4748 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794b1ed4-1370-4c35-b4b2-7ad2da01f690" containerName="nova-scheduler-scheduler" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.164661 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="794b1ed4-1370-4c35-b4b2-7ad2da01f690" containerName="nova-scheduler-scheduler" Mar 20 11:00:00 crc kubenswrapper[4748]: E0320 11:00:00.164674 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d51df005-e870-4b00-9eac-72bb099077dd" containerName="nova-metadata-metadata" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.164681 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51df005-e870-4b00-9eac-72bb099077dd" containerName="nova-metadata-metadata" Mar 20 11:00:00 crc kubenswrapper[4748]: E0320 11:00:00.164706 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d51df005-e870-4b00-9eac-72bb099077dd" containerName="nova-metadata-log" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.164717 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51df005-e870-4b00-9eac-72bb099077dd" containerName="nova-metadata-log" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.164917 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="d51df005-e870-4b00-9eac-72bb099077dd" containerName="nova-metadata-metadata" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.164930 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="794b1ed4-1370-4c35-b4b2-7ad2da01f690" containerName="nova-scheduler-scheduler" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.164945 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="d51df005-e870-4b00-9eac-72bb099077dd" containerName="nova-metadata-log" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.165798 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-jzzl4" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.172402 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.172440 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.173044 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.186486 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf"] Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.188045 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.190555 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.191076 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.208499 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-jzzl4"] Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.226454 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn8lg\" (UniqueName: \"kubernetes.io/projected/7d6cdaef-9045-42e8-8033-abba36827d27-kube-api-access-mn8lg\") pod \"collect-profiles-29566740-nhhgf\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.227155 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6cdaef-9045-42e8-8033-abba36827d27-config-volume\") pod \"collect-profiles-29566740-nhhgf\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.227246 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d6cdaef-9045-42e8-8033-abba36827d27-secret-volume\") pod \"collect-profiles-29566740-nhhgf\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.228477 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx6l5\" (UniqueName: \"kubernetes.io/projected/4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4-kube-api-access-tx6l5\") pod \"auto-csr-approver-29566740-jzzl4\" (UID: \"4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4\") " pod="openshift-infra/auto-csr-approver-29566740-jzzl4" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.232495 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf"] Mar 20 11:00:00 crc kubenswrapper[4748]: E0320 11:00:00.275172 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd51df005_e870_4b00_9eac_72bb099077dd.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.291613 4748 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.309809 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.325587 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.328853 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.333575 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.333809 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.336730 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6cdaef-9045-42e8-8033-abba36827d27-config-volume\") pod \"collect-profiles-29566740-nhhgf\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.336808 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d6cdaef-9045-42e8-8033-abba36827d27-secret-volume\") pod \"collect-profiles-29566740-nhhgf\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.337790 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.337985 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tx6l5\" (UniqueName: \"kubernetes.io/projected/4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4-kube-api-access-tx6l5\") pod \"auto-csr-approver-29566740-jzzl4\" (UID: \"4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4\") " pod="openshift-infra/auto-csr-approver-29566740-jzzl4" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.338561 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6cdaef-9045-42e8-8033-abba36827d27-config-volume\") pod \"collect-profiles-29566740-nhhgf\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.343370 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d6cdaef-9045-42e8-8033-abba36827d27-secret-volume\") pod \"collect-profiles-29566740-nhhgf\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.343611 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn8lg\" (UniqueName: \"kubernetes.io/projected/7d6cdaef-9045-42e8-8033-abba36827d27-kube-api-access-mn8lg\") pod \"collect-profiles-29566740-nhhgf\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.366368 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx6l5\" (UniqueName: \"kubernetes.io/projected/4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4-kube-api-access-tx6l5\") pod \"auto-csr-approver-29566740-jzzl4\" (UID: \"4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4\") " pod="openshift-infra/auto-csr-approver-29566740-jzzl4" Mar 20 
11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.374803 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn8lg\" (UniqueName: \"kubernetes.io/projected/7d6cdaef-9045-42e8-8033-abba36827d27-kube-api-access-mn8lg\") pod \"collect-profiles-29566740-nhhgf\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.445139 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-logs\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.445206 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-config-data\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.445223 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rp7k\" (UniqueName: \"kubernetes.io/projected/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-kube-api-access-6rp7k\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.445259 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 
11:00:00.445322 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.526372 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-jzzl4" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.536757 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.547767 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-logs\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.547850 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-config-data\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.547872 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rp7k\" (UniqueName: \"kubernetes.io/projected/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-kube-api-access-6rp7k\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.547925 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.547957 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.550143 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-logs\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.552007 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.552279 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-config-data\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.553158 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 
11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.571625 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rp7k\" (UniqueName: \"kubernetes.io/projected/2f35a381-dc79-4781-97a4-1d0c8f96a0d2-kube-api-access-6rp7k\") pod \"nova-metadata-0\" (UID: \"2f35a381-dc79-4781-97a4-1d0c8f96a0d2\") " pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.756512 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.881605 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"794b1ed4-1370-4c35-b4b2-7ad2da01f690","Type":"ContainerDied","Data":"9c9e6d3afd0cba4660c04af89c500343aadf2d4b4eb1164c1a0e0fd13136e506"} Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.881662 4748 scope.go:117] "RemoveContainer" containerID="c7035bb73c98a18f77aaa64deb90fcdc3b05847f09179b722a88febf5926ad0e" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.881764 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.940367 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 11:00:00 crc kubenswrapper[4748]: I0320 11:00:00.958954 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.007036 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.008888 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.013238 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.018324 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.059508 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt6rl\" (UniqueName: \"kubernetes.io/projected/734b70b0-5549-4b8e-aa70-a9d589c5b457-kube-api-access-wt6rl\") pod \"nova-scheduler-0\" (UID: \"734b70b0-5549-4b8e-aa70-a9d589c5b457\") " pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.059742 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b70b0-5549-4b8e-aa70-a9d589c5b457-config-data\") pod \"nova-scheduler-0\" (UID: \"734b70b0-5549-4b8e-aa70-a9d589c5b457\") " pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.059855 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b70b0-5549-4b8e-aa70-a9d589c5b457-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"734b70b0-5549-4b8e-aa70-a9d589c5b457\") " pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.068188 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-jzzl4"] Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.117848 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf"] Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.162504 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b70b0-5549-4b8e-aa70-a9d589c5b457-config-data\") pod \"nova-scheduler-0\" (UID: \"734b70b0-5549-4b8e-aa70-a9d589c5b457\") " pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.162613 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b70b0-5549-4b8e-aa70-a9d589c5b457-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"734b70b0-5549-4b8e-aa70-a9d589c5b457\") " pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.162700 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt6rl\" (UniqueName: \"kubernetes.io/projected/734b70b0-5549-4b8e-aa70-a9d589c5b457-kube-api-access-wt6rl\") pod \"nova-scheduler-0\" (UID: \"734b70b0-5549-4b8e-aa70-a9d589c5b457\") " pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.169046 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b70b0-5549-4b8e-aa70-a9d589c5b457-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"734b70b0-5549-4b8e-aa70-a9d589c5b457\") " pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.170416 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b70b0-5549-4b8e-aa70-a9d589c5b457-config-data\") pod \"nova-scheduler-0\" (UID: \"734b70b0-5549-4b8e-aa70-a9d589c5b457\") " pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.181496 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt6rl\" (UniqueName: 
\"kubernetes.io/projected/734b70b0-5549-4b8e-aa70-a9d589c5b457-kube-api-access-wt6rl\") pod \"nova-scheduler-0\" (UID: \"734b70b0-5549-4b8e-aa70-a9d589c5b457\") " pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.313306 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 11:00:01 crc kubenswrapper[4748]: W0320 11:00:01.316003 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f35a381_dc79_4781_97a4_1d0c8f96a0d2.slice/crio-e9305f1e15e608e05bb6b7681ca424c501b5ce0a4336e8834a56dce5e9fa70e0 WatchSource:0}: Error finding container e9305f1e15e608e05bb6b7681ca424c501b5ce0a4336e8834a56dce5e9fa70e0: Status 404 returned error can't find the container with id e9305f1e15e608e05bb6b7681ca424c501b5ce0a4336e8834a56dce5e9fa70e0 Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.342158 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.526404 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794b1ed4-1370-4c35-b4b2-7ad2da01f690" path="/var/lib/kubelet/pods/794b1ed4-1370-4c35-b4b2-7ad2da01f690/volumes" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.527177 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d51df005-e870-4b00-9eac-72bb099077dd" path="/var/lib/kubelet/pods/d51df005-e870-4b00-9eac-72bb099077dd/volumes" Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.823546 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 11:00:01 crc kubenswrapper[4748]: W0320 11:00:01.826184 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod734b70b0_5549_4b8e_aa70_a9d589c5b457.slice/crio-9787a3956e323006b54031ae635ed8c13ffc718c490b3de55c27639093c708de WatchSource:0}: Error finding container 9787a3956e323006b54031ae635ed8c13ffc718c490b3de55c27639093c708de: Status 404 returned error can't find the container with id 9787a3956e323006b54031ae635ed8c13ffc718c490b3de55c27639093c708de Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.909575 4748 generic.go:334] "Generic (PLEG): container finished" podID="7d6cdaef-9045-42e8-8033-abba36827d27" containerID="1e7b1cf60089813463bee8757db9fe6e1668ec9a86893c9a2657ea67cb2d3cb0" exitCode=0 Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.909673 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" event={"ID":"7d6cdaef-9045-42e8-8033-abba36827d27","Type":"ContainerDied","Data":"1e7b1cf60089813463bee8757db9fe6e1668ec9a86893c9a2657ea67cb2d3cb0"} Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.909741 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" event={"ID":"7d6cdaef-9045-42e8-8033-abba36827d27","Type":"ContainerStarted","Data":"cd81c86b7ac49cf26f1c217b4eef26eca4703585fa13801f31db1c09e4c96bb0"} Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.912050 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f35a381-dc79-4781-97a4-1d0c8f96a0d2","Type":"ContainerStarted","Data":"d39fbe66263b4e7593d34d74d9cc78cfb249fe16050b4dbc94930f3e84e29caf"} Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.912093 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f35a381-dc79-4781-97a4-1d0c8f96a0d2","Type":"ContainerStarted","Data":"a7c363c3fd8244cabff65c31514a5f834f1697eebbf698eb995d9f6ed50dddb1"} Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.912114 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f35a381-dc79-4781-97a4-1d0c8f96a0d2","Type":"ContainerStarted","Data":"e9305f1e15e608e05bb6b7681ca424c501b5ce0a4336e8834a56dce5e9fa70e0"} Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.915222 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-jzzl4" event={"ID":"4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4","Type":"ContainerStarted","Data":"cb6f055714cd6b552ffe524dd020baaacbb62db8954cb7b587377dee031973dd"} Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.916255 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"734b70b0-5549-4b8e-aa70-a9d589c5b457","Type":"ContainerStarted","Data":"9787a3956e323006b54031ae635ed8c13ffc718c490b3de55c27639093c708de"} Mar 20 11:00:01 crc kubenswrapper[4748]: I0320 11:00:01.956583 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9565590670000002 
podStartE2EDuration="1.956559067s" podCreationTimestamp="2026-03-20 11:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:01.9466799 +0000 UTC m=+1437.088225714" watchObservedRunningTime="2026-03-20 11:00:01.956559067 +0000 UTC m=+1437.098104881" Mar 20 11:00:02 crc kubenswrapper[4748]: I0320 11:00:02.930071 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"734b70b0-5549-4b8e-aa70-a9d589c5b457","Type":"ContainerStarted","Data":"85df4af97393c8d2b9233342f9109bc803089e48baeb50968f4b04c0e3e1d1e1"} Mar 20 11:00:02 crc kubenswrapper[4748]: I0320 11:00:02.950643 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.950623468 podStartE2EDuration="2.950623468s" podCreationTimestamp="2026-03-20 11:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:02.943198372 +0000 UTC m=+1438.084744206" watchObservedRunningTime="2026-03-20 11:00:02.950623468 +0000 UTC m=+1438.092169282" Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.369865 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.405700 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn8lg\" (UniqueName: \"kubernetes.io/projected/7d6cdaef-9045-42e8-8033-abba36827d27-kube-api-access-mn8lg\") pod \"7d6cdaef-9045-42e8-8033-abba36827d27\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.405896 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6cdaef-9045-42e8-8033-abba36827d27-config-volume\") pod \"7d6cdaef-9045-42e8-8033-abba36827d27\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.405966 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d6cdaef-9045-42e8-8033-abba36827d27-secret-volume\") pod \"7d6cdaef-9045-42e8-8033-abba36827d27\" (UID: \"7d6cdaef-9045-42e8-8033-abba36827d27\") " Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.406766 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d6cdaef-9045-42e8-8033-abba36827d27-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d6cdaef-9045-42e8-8033-abba36827d27" (UID: "7d6cdaef-9045-42e8-8033-abba36827d27"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.412780 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d6cdaef-9045-42e8-8033-abba36827d27-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d6cdaef-9045-42e8-8033-abba36827d27" (UID: "7d6cdaef-9045-42e8-8033-abba36827d27"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.412965 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d6cdaef-9045-42e8-8033-abba36827d27-kube-api-access-mn8lg" (OuterVolumeSpecName: "kube-api-access-mn8lg") pod "7d6cdaef-9045-42e8-8033-abba36827d27" (UID: "7d6cdaef-9045-42e8-8033-abba36827d27"). InnerVolumeSpecName "kube-api-access-mn8lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.508860 4748 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6cdaef-9045-42e8-8033-abba36827d27-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.508900 4748 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d6cdaef-9045-42e8-8033-abba36827d27-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.508912 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn8lg\" (UniqueName: \"kubernetes.io/projected/7d6cdaef-9045-42e8-8033-abba36827d27-kube-api-access-mn8lg\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.944785 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.944862 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf" event={"ID":"7d6cdaef-9045-42e8-8033-abba36827d27","Type":"ContainerDied","Data":"cd81c86b7ac49cf26f1c217b4eef26eca4703585fa13801f31db1c09e4c96bb0"} Mar 20 11:00:03 crc kubenswrapper[4748]: I0320 11:00:03.944896 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd81c86b7ac49cf26f1c217b4eef26eca4703585fa13801f31db1c09e4c96bb0" Mar 20 11:00:06 crc kubenswrapper[4748]: I0320 11:00:06.342325 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 11:00:06 crc kubenswrapper[4748]: I0320 11:00:06.979477 4748 generic.go:334] "Generic (PLEG): container finished" podID="4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4" containerID="8699a18cdf70da16c891192e1dcf8147b0e4f26040a7df9f400a9bdf1dfd12e2" exitCode=0 Mar 20 11:00:06 crc kubenswrapper[4748]: I0320 11:00:06.979535 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-jzzl4" event={"ID":"4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4","Type":"ContainerDied","Data":"8699a18cdf70da16c891192e1dcf8147b0e4f26040a7df9f400a9bdf1dfd12e2"} Mar 20 11:00:07 crc kubenswrapper[4748]: I0320 11:00:07.202069 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 11:00:07 crc kubenswrapper[4748]: I0320 11:00:07.202462 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 11:00:08 crc kubenswrapper[4748]: I0320 11:00:08.218022 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6acc81f7-4f7d-4828-a328-1e2a4426bd57" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 11:00:08 crc kubenswrapper[4748]: I0320 11:00:08.218283 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6acc81f7-4f7d-4828-a328-1e2a4426bd57" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 11:00:08 crc kubenswrapper[4748]: I0320 11:00:08.387273 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-jzzl4" Mar 20 11:00:08 crc kubenswrapper[4748]: I0320 11:00:08.422029 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx6l5\" (UniqueName: \"kubernetes.io/projected/4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4-kube-api-access-tx6l5\") pod \"4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4\" (UID: \"4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4\") " Mar 20 11:00:08 crc kubenswrapper[4748]: I0320 11:00:08.430138 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4-kube-api-access-tx6l5" (OuterVolumeSpecName: "kube-api-access-tx6l5") pod "4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4" (UID: "4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4"). InnerVolumeSpecName "kube-api-access-tx6l5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:08 crc kubenswrapper[4748]: I0320 11:00:08.524636 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx6l5\" (UniqueName: \"kubernetes.io/projected/4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4-kube-api-access-tx6l5\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.006669 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-jzzl4" event={"ID":"4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4","Type":"ContainerDied","Data":"cb6f055714cd6b552ffe524dd020baaacbb62db8954cb7b587377dee031973dd"} Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.006720 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb6f055714cd6b552ffe524dd020baaacbb62db8954cb7b587377dee031973dd" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.007073 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-jzzl4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.031279 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tbls4"] Mar 20 11:00:09 crc kubenswrapper[4748]: E0320 11:00:09.031769 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d6cdaef-9045-42e8-8033-abba36827d27" containerName="collect-profiles" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.031788 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d6cdaef-9045-42e8-8033-abba36827d27" containerName="collect-profiles" Mar 20 11:00:09 crc kubenswrapper[4748]: E0320 11:00:09.031967 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4" containerName="oc" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.031981 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4" 
containerName="oc" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.032250 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d6cdaef-9045-42e8-8033-abba36827d27" containerName="collect-profiles" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.032287 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4" containerName="oc" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.033751 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.042693 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbls4"] Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.137402 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzrwm\" (UniqueName: \"kubernetes.io/projected/24fc5546-d8d1-4618-97a8-2389b545ff79-kube-api-access-mzrwm\") pod \"redhat-operators-tbls4\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.137550 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-utilities\") pod \"redhat-operators-tbls4\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.137682 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-catalog-content\") pod \"redhat-operators-tbls4\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " 
pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.238767 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzrwm\" (UniqueName: \"kubernetes.io/projected/24fc5546-d8d1-4618-97a8-2389b545ff79-kube-api-access-mzrwm\") pod \"redhat-operators-tbls4\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.239376 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-utilities\") pod \"redhat-operators-tbls4\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.239484 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-catalog-content\") pod \"redhat-operators-tbls4\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.239814 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-utilities\") pod \"redhat-operators-tbls4\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.240102 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-catalog-content\") pod \"redhat-operators-tbls4\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc 
kubenswrapper[4748]: I0320 11:00:09.269911 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzrwm\" (UniqueName: \"kubernetes.io/projected/24fc5546-d8d1-4618-97a8-2389b545ff79-kube-api-access-mzrwm\") pod \"redhat-operators-tbls4\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.358430 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.486128 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566734-jbtgv"] Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.498930 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566734-jbtgv"] Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.536425 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1f05295-f9d1-467f-9b86-99f226ca7765" path="/var/lib/kubelet/pods/a1f05295-f9d1-467f-9b86-99f226ca7765/volumes" Mar 20 11:00:09 crc kubenswrapper[4748]: I0320 11:00:09.845426 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbls4"] Mar 20 11:00:10 crc kubenswrapper[4748]: I0320 11:00:10.020398 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbls4" event={"ID":"24fc5546-d8d1-4618-97a8-2389b545ff79","Type":"ContainerStarted","Data":"5c528858b8b091411d6e7fde6f4b3febe72fd687aff14adad264f7f0cf38aeef"} Mar 20 11:00:10 crc kubenswrapper[4748]: I0320 11:00:10.757585 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 11:00:10 crc kubenswrapper[4748]: I0320 11:00:10.757982 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 
20 11:00:11 crc kubenswrapper[4748]: I0320 11:00:11.033310 4748 generic.go:334] "Generic (PLEG): container finished" podID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerID="14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205" exitCode=0 Mar 20 11:00:11 crc kubenswrapper[4748]: I0320 11:00:11.033365 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbls4" event={"ID":"24fc5546-d8d1-4618-97a8-2389b545ff79","Type":"ContainerDied","Data":"14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205"} Mar 20 11:00:11 crc kubenswrapper[4748]: I0320 11:00:11.343278 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 11:00:11 crc kubenswrapper[4748]: I0320 11:00:11.373314 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 11:00:11 crc kubenswrapper[4748]: I0320 11:00:11.810195 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f35a381-dc79-4781-97a4-1d0c8f96a0d2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 11:00:11 crc kubenswrapper[4748]: I0320 11:00:11.810560 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f35a381-dc79-4781-97a4-1d0c8f96a0d2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 11:00:12 crc kubenswrapper[4748]: I0320 11:00:12.089938 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 11:00:13 crc kubenswrapper[4748]: I0320 11:00:13.055083 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-tbls4" event={"ID":"24fc5546-d8d1-4618-97a8-2389b545ff79","Type":"ContainerStarted","Data":"8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17"} Mar 20 11:00:15 crc kubenswrapper[4748]: I0320 11:00:15.075534 4748 generic.go:334] "Generic (PLEG): container finished" podID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerID="8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17" exitCode=0 Mar 20 11:00:15 crc kubenswrapper[4748]: I0320 11:00:15.075644 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbls4" event={"ID":"24fc5546-d8d1-4618-97a8-2389b545ff79","Type":"ContainerDied","Data":"8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17"} Mar 20 11:00:15 crc kubenswrapper[4748]: I0320 11:00:15.202000 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 11:00:15 crc kubenswrapper[4748]: I0320 11:00:15.202076 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 11:00:16 crc kubenswrapper[4748]: I0320 11:00:16.971307 4748 scope.go:117] "RemoveContainer" containerID="ff823410f00759d7564d35667a82e4412741fc7128102625a43ed2140f0ec63a" Mar 20 11:00:17 crc kubenswrapper[4748]: I0320 11:00:17.102986 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbls4" event={"ID":"24fc5546-d8d1-4618-97a8-2389b545ff79","Type":"ContainerStarted","Data":"183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6"} Mar 20 11:00:17 crc kubenswrapper[4748]: I0320 11:00:17.209808 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 11:00:17 crc kubenswrapper[4748]: I0320 11:00:17.209923 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 11:00:17 crc kubenswrapper[4748]: I0320 11:00:17.217527 4748 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 11:00:17 crc kubenswrapper[4748]: I0320 11:00:17.225911 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 11:00:17 crc kubenswrapper[4748]: I0320 11:00:17.238883 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tbls4" podStartSLOduration=4.15242068 podStartE2EDuration="9.238826694s" podCreationTimestamp="2026-03-20 11:00:08 +0000 UTC" firstStartedPulling="2026-03-20 11:00:11.03598009 +0000 UTC m=+1446.177525904" lastFinishedPulling="2026-03-20 11:00:16.122386104 +0000 UTC m=+1451.263931918" observedRunningTime="2026-03-20 11:00:17.13905267 +0000 UTC m=+1452.280598604" watchObservedRunningTime="2026-03-20 11:00:17.238826694 +0000 UTC m=+1452.380372508" Mar 20 11:00:18 crc kubenswrapper[4748]: I0320 11:00:18.757413 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 11:00:18 crc kubenswrapper[4748]: I0320 11:00:18.757730 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 11:00:19 crc kubenswrapper[4748]: I0320 11:00:19.358897 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:19 crc kubenswrapper[4748]: I0320 11:00:19.359185 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:20 crc kubenswrapper[4748]: I0320 11:00:20.411072 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tbls4" podUID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerName="registry-server" probeResult="failure" output=< Mar 20 11:00:20 crc kubenswrapper[4748]: timeout: failed to connect service ":50051" within 1s Mar 20 11:00:20 crc 
kubenswrapper[4748]: > Mar 20 11:00:20 crc kubenswrapper[4748]: I0320 11:00:20.763160 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 11:00:20 crc kubenswrapper[4748]: I0320 11:00:20.770272 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 11:00:20 crc kubenswrapper[4748]: I0320 11:00:20.777419 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 11:00:21 crc kubenswrapper[4748]: I0320 11:00:21.178427 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 11:00:24 crc kubenswrapper[4748]: I0320 11:00:24.210171 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 11:00:29 crc kubenswrapper[4748]: I0320 11:00:29.408134 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:29 crc kubenswrapper[4748]: I0320 11:00:29.469588 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:29 crc kubenswrapper[4748]: I0320 11:00:29.645591 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbls4"] Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.261141 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tbls4" podUID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerName="registry-server" containerID="cri-o://183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6" gracePeriod=2 Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.700795 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.785686 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-catalog-content\") pod \"24fc5546-d8d1-4618-97a8-2389b545ff79\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.786282 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-utilities\") pod \"24fc5546-d8d1-4618-97a8-2389b545ff79\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.786483 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzrwm\" (UniqueName: \"kubernetes.io/projected/24fc5546-d8d1-4618-97a8-2389b545ff79-kube-api-access-mzrwm\") pod \"24fc5546-d8d1-4618-97a8-2389b545ff79\" (UID: \"24fc5546-d8d1-4618-97a8-2389b545ff79\") " Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.787105 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-utilities" (OuterVolumeSpecName: "utilities") pod "24fc5546-d8d1-4618-97a8-2389b545ff79" (UID: "24fc5546-d8d1-4618-97a8-2389b545ff79"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.787359 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.795427 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24fc5546-d8d1-4618-97a8-2389b545ff79-kube-api-access-mzrwm" (OuterVolumeSpecName: "kube-api-access-mzrwm") pod "24fc5546-d8d1-4618-97a8-2389b545ff79" (UID: "24fc5546-d8d1-4618-97a8-2389b545ff79"). InnerVolumeSpecName "kube-api-access-mzrwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.889393 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzrwm\" (UniqueName: \"kubernetes.io/projected/24fc5546-d8d1-4618-97a8-2389b545ff79-kube-api-access-mzrwm\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.925395 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24fc5546-d8d1-4618-97a8-2389b545ff79" (UID: "24fc5546-d8d1-4618-97a8-2389b545ff79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:00:31 crc kubenswrapper[4748]: I0320 11:00:31.991211 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24fc5546-d8d1-4618-97a8-2389b545ff79-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.271993 4748 generic.go:334] "Generic (PLEG): container finished" podID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerID="183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6" exitCode=0 Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.272041 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbls4" event={"ID":"24fc5546-d8d1-4618-97a8-2389b545ff79","Type":"ContainerDied","Data":"183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6"} Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.272072 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbls4" event={"ID":"24fc5546-d8d1-4618-97a8-2389b545ff79","Type":"ContainerDied","Data":"5c528858b8b091411d6e7fde6f4b3febe72fd687aff14adad264f7f0cf38aeef"} Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.272090 4748 scope.go:117] "RemoveContainer" containerID="183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6" Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.272227 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbls4" Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.299606 4748 scope.go:117] "RemoveContainer" containerID="8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17" Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.318098 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbls4"] Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.327892 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tbls4"] Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.342048 4748 scope.go:117] "RemoveContainer" containerID="14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205" Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.391342 4748 scope.go:117] "RemoveContainer" containerID="183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6" Mar 20 11:00:32 crc kubenswrapper[4748]: E0320 11:00:32.391951 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6\": container with ID starting with 183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6 not found: ID does not exist" containerID="183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6" Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.392010 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6"} err="failed to get container status \"183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6\": rpc error: code = NotFound desc = could not find container \"183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6\": container with ID starting with 183b1de2281b17db8fd17651d9055207e82e4e032031cb4aa8d989052cda0ad6 not found: ID does 
not exist" Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.392046 4748 scope.go:117] "RemoveContainer" containerID="8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17" Mar 20 11:00:32 crc kubenswrapper[4748]: E0320 11:00:32.392439 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17\": container with ID starting with 8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17 not found: ID does not exist" containerID="8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17" Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.392508 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17"} err="failed to get container status \"8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17\": rpc error: code = NotFound desc = could not find container \"8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17\": container with ID starting with 8b18e163fd44da4e2e1d23d26c017e7d063b857c2518fcf2fe4c4926d033aa17 not found: ID does not exist" Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.392552 4748 scope.go:117] "RemoveContainer" containerID="14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205" Mar 20 11:00:32 crc kubenswrapper[4748]: E0320 11:00:32.393063 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205\": container with ID starting with 14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205 not found: ID does not exist" containerID="14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205" Mar 20 11:00:32 crc kubenswrapper[4748]: I0320 11:00:32.393094 4748 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205"} err="failed to get container status \"14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205\": rpc error: code = NotFound desc = could not find container \"14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205\": container with ID starting with 14dac6319fb86b917254c0cb7eb59e418171e392bf633a2bd89ff05c18d1a205 not found: ID does not exist" Mar 20 11:00:33 crc kubenswrapper[4748]: I0320 11:00:33.481251 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 11:00:33 crc kubenswrapper[4748]: I0320 11:00:33.526369 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24fc5546-d8d1-4618-97a8-2389b545ff79" path="/var/lib/kubelet/pods/24fc5546-d8d1-4618-97a8-2389b545ff79/volumes" Mar 20 11:00:34 crc kubenswrapper[4748]: I0320 11:00:34.320660 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 11:00:37 crc kubenswrapper[4748]: I0320 11:00:37.815010 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c9362889-0195-4aad-96bd-ed63db88da83" containerName="rabbitmq" containerID="cri-o://80f1fafc8886612f8d64fc6572f4a7fedd2096e9035a5f9faa8fb4a3ddc04814" gracePeriod=604796 Mar 20 11:00:38 crc kubenswrapper[4748]: I0320 11:00:38.599478 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" containerName="rabbitmq" containerID="cri-o://3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9" gracePeriod=604796 Mar 20 11:00:39 crc kubenswrapper[4748]: I0320 11:00:39.275565 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Mar 20 11:00:39 crc kubenswrapper[4748]: I0320 11:00:39.814237 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c9362889-0195-4aad-96bd-ed63db88da83" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.387953 4748 generic.go:334] "Generic (PLEG): container finished" podID="c9362889-0195-4aad-96bd-ed63db88da83" containerID="80f1fafc8886612f8d64fc6572f4a7fedd2096e9035a5f9faa8fb4a3ddc04814" exitCode=0 Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.388042 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9362889-0195-4aad-96bd-ed63db88da83","Type":"ContainerDied","Data":"80f1fafc8886612f8d64fc6572f4a7fedd2096e9035a5f9faa8fb4a3ddc04814"} Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.657652 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.790088 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-server-conf\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.790166 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4d4s\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-kube-api-access-w4d4s\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.790193 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.790247 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-confd\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.791394 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-config-data\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.791459 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-erlang-cookie\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.791549 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-plugins\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.791569 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9362889-0195-4aad-96bd-ed63db88da83-pod-info\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.791648 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-plugins-conf\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.791712 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-tls\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.791744 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9362889-0195-4aad-96bd-ed63db88da83-erlang-cookie-secret\") pod \"c9362889-0195-4aad-96bd-ed63db88da83\" (UID: \"c9362889-0195-4aad-96bd-ed63db88da83\") " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 
11:00:44.792128 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.792428 4748 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.793284 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.793813 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.797213 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9362889-0195-4aad-96bd-ed63db88da83-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.797736 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.797874 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-kube-api-access-w4d4s" (OuterVolumeSpecName: "kube-api-access-w4d4s") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "kube-api-access-w4d4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.802655 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c9362889-0195-4aad-96bd-ed63db88da83-pod-info" (OuterVolumeSpecName: "pod-info") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.805210 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.831534 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-config-data" (OuterVolumeSpecName: "config-data") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.879077 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-server-conf" (OuterVolumeSpecName: "server-conf") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.894614 4748 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.894660 4748 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9362889-0195-4aad-96bd-ed63db88da83-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.894673 4748 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.894686 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4d4s\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-kube-api-access-w4d4s\") on node \"crc\" 
DevicePath \"\"" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.894723 4748 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.894735 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.894749 4748 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.894761 4748 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9362889-0195-4aad-96bd-ed63db88da83-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.894771 4748 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9362889-0195-4aad-96bd-ed63db88da83-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.922611 4748 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.942631 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c9362889-0195-4aad-96bd-ed63db88da83" (UID: "c9362889-0195-4aad-96bd-ed63db88da83"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.996423 4748 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:44 crc kubenswrapper[4748]: I0320 11:00:44.996471 4748 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9362889-0195-4aad-96bd-ed63db88da83-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.138159 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.301114 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-config-data\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.301433 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-erlang-cookie\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.301504 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-server-conf\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.301524 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-erlang-cookie-secret\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.301591 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-plugins\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.301632 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-confd\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.301661 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-tls\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.301703 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.301747 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmt7j\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-kube-api-access-vmt7j\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 
11:00:45.301881 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-pod-info\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.301910 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-plugins-conf\") pod \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\" (UID: \"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7\") " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.302772 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.302871 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.306625 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.311039 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-pod-info" (OuterVolumeSpecName: "pod-info") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.311107 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-kube-api-access-vmt7j" (OuterVolumeSpecName: "kube-api-access-vmt7j") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "kube-api-access-vmt7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.311113 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.311253 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.312202 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.349646 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-config-data" (OuterVolumeSpecName: "config-data") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.387993 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-server-conf" (OuterVolumeSpecName: "server-conf") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.405684 4748 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.405731 4748 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.405744 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.405757 4748 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.405772 4748 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.405786 4748 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.405795 4748 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.405805 
4748 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.405868 4748 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.405882 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmt7j\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-kube-api-access-vmt7j\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.432355 4748 generic.go:334] "Generic (PLEG): container finished" podID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" containerID="3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9" exitCode=0 Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.432424 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7","Type":"ContainerDied","Data":"3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9"} Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.432453 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a5a9b3e3-3a44-4765-ab5b-0e7955b524f7","Type":"ContainerDied","Data":"33d77a2ba8801769f0bbcc59b4a07c93ff3300d311223f5055ef01fe17c63a03"} Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.432474 4748 scope.go:117] "RemoveContainer" containerID="3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.432899 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.457081 4748 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.470756 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9362889-0195-4aad-96bd-ed63db88da83","Type":"ContainerDied","Data":"6ddb29d78aaa1efa0d79ef7d996e0d06f7985af2a78b3e4a1947ce3b405d8608"} Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.470917 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.508293 4748 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.583503 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" (UID: "a5a9b3e3-3a44-4765-ab5b-0e7955b524f7"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.615483 4748 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.625969 4748 scope.go:117] "RemoveContainer" containerID="6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.656070 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.683568 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.685304 4748 scope.go:117] "RemoveContainer" containerID="3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9" Mar 20 11:00:45 crc kubenswrapper[4748]: E0320 11:00:45.685862 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9\": container with ID starting with 3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9 not found: ID does not exist" containerID="3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.685894 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9"} err="failed to get container status \"3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9\": rpc error: code = NotFound desc = could not find container \"3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9\": container with ID starting with 
3b70d0f50bcb025dda893bb95fefe7d1a64f4c657a0f73723c19dd93258de0c9 not found: ID does not exist" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.685917 4748 scope.go:117] "RemoveContainer" containerID="6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774" Mar 20 11:00:45 crc kubenswrapper[4748]: E0320 11:00:45.687262 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774\": container with ID starting with 6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774 not found: ID does not exist" containerID="6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.687313 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774"} err="failed to get container status \"6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774\": rpc error: code = NotFound desc = could not find container \"6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774\": container with ID starting with 6ba003918b6897670cc8bdcdc8ff22f66bc5073d251d1b4a3f6edadbbe769774 not found: ID does not exist" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.687349 4748 scope.go:117] "RemoveContainer" containerID="80f1fafc8886612f8d64fc6572f4a7fedd2096e9035a5f9faa8fb4a3ddc04814" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.700972 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 11:00:45 crc kubenswrapper[4748]: E0320 11:00:45.701437 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerName="registry-server" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.701461 4748 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerName="registry-server" Mar 20 11:00:45 crc kubenswrapper[4748]: E0320 11:00:45.701482 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9362889-0195-4aad-96bd-ed63db88da83" containerName="setup-container" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.701489 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9362889-0195-4aad-96bd-ed63db88da83" containerName="setup-container" Mar 20 11:00:45 crc kubenswrapper[4748]: E0320 11:00:45.701502 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerName="extract-utilities" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.701507 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerName="extract-utilities" Mar 20 11:00:45 crc kubenswrapper[4748]: E0320 11:00:45.701521 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" containerName="rabbitmq" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.701527 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" containerName="rabbitmq" Mar 20 11:00:45 crc kubenswrapper[4748]: E0320 11:00:45.701541 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9362889-0195-4aad-96bd-ed63db88da83" containerName="rabbitmq" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.701547 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9362889-0195-4aad-96bd-ed63db88da83" containerName="rabbitmq" Mar 20 11:00:45 crc kubenswrapper[4748]: E0320 11:00:45.701564 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" containerName="setup-container" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.701570 4748 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" containerName="setup-container" Mar 20 11:00:45 crc kubenswrapper[4748]: E0320 11:00:45.701582 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerName="extract-content" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.701589 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerName="extract-content" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.701750 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="24fc5546-d8d1-4618-97a8-2389b545ff79" containerName="registry-server" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.701773 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" containerName="rabbitmq" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.701785 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9362889-0195-4aad-96bd-ed63db88da83" containerName="rabbitmq" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.703422 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.708491 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.708514 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.709295 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.709420 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.709538 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.709636 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.709815 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kzfkz" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.731377 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.732317 4748 scope.go:117] "RemoveContainer" containerID="6ccf567ca597230b5f8e105670b07f93dbf0776dfe28b4aadbed6ba96ca21f1b" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.770898 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.792547 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.817368 4748 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.819145 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.821161 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.821259 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-config-data\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.821335 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.821417 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.821505 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.821799 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.821885 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.821956 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.822018 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.822140 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdjqh\" (UniqueName: 
\"kubernetes.io/projected/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-kube-api-access-xdjqh\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.822223 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.823224 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.823456 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.823492 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.823688 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.823828 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.823854 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-86jnz" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.824046 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.828560 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 11:00:45 crc 
kubenswrapper[4748]: I0320 11:00:45.924150 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdjqh\" (UniqueName: \"kubernetes.io/projected/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-kube-api-access-xdjqh\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.924953 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.925573 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.926055 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.926576 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-config-data\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928026 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/507647d5-8633-4346-a9e0-4af3eb0e3e5f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928170 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507647d5-8633-4346-a9e0-4af3eb0e3e5f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.926513 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928335 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928427 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507647d5-8633-4346-a9e0-4af3eb0e3e5f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928464 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928501 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928561 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928595 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928621 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507647d5-8633-4346-a9e0-4af3eb0e3e5f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928654 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928699 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507647d5-8633-4346-a9e0-4af3eb0e3e5f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928740 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928763 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928792 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928816 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928874 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.928968 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jmd\" (UniqueName: \"kubernetes.io/projected/507647d5-8633-4346-a9e0-4af3eb0e3e5f-kube-api-access-87jmd\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.929029 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.929982 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.930582 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc 
kubenswrapper[4748]: I0320 11:00:45.933402 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.933703 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.933865 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.927946 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-config-data\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.934227 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.936610 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.939571 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdjqh\" (UniqueName: \"kubernetes.io/projected/b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a-kube-api-access-xdjqh\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:45 crc kubenswrapper[4748]: I0320 11:00:45.974566 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a\") " pod="openstack/rabbitmq-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.030659 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.030699 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.030721 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507647d5-8633-4346-a9e0-4af3eb0e3e5f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.030744 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507647d5-8633-4346-a9e0-4af3eb0e3e5f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.030774 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.030811 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87jmd\" (UniqueName: \"kubernetes.io/projected/507647d5-8633-4346-a9e0-4af3eb0e3e5f-kube-api-access-87jmd\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.030871 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.030943 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507647d5-8633-4346-a9e0-4af3eb0e3e5f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc 
kubenswrapper[4748]: I0320 11:00:46.030963 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507647d5-8633-4346-a9e0-4af3eb0e3e5f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.030987 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507647d5-8633-4346-a9e0-4af3eb0e3e5f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.031010 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.031280 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.031386 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.031484 4748 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.032366 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507647d5-8633-4346-a9e0-4af3eb0e3e5f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.034177 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507647d5-8633-4346-a9e0-4af3eb0e3e5f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.034540 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507647d5-8633-4346-a9e0-4af3eb0e3e5f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.035038 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.035124 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507647d5-8633-4346-a9e0-4af3eb0e3e5f-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.039045 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.039859 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507647d5-8633-4346-a9e0-4af3eb0e3e5f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.042410 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507647d5-8633-4346-a9e0-4af3eb0e3e5f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.054553 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87jmd\" (UniqueName: \"kubernetes.io/projected/507647d5-8633-4346-a9e0-4af3eb0e3e5f-kube-api-access-87jmd\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.065537 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"507647d5-8633-4346-a9e0-4af3eb0e3e5f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.144296 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.498401 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 11:00:46 crc kubenswrapper[4748]: I0320 11:00:46.634624 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 11:00:46 crc kubenswrapper[4748]: W0320 11:00:46.638085 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod507647d5_8633_4346_a9e0_4af3eb0e3e5f.slice/crio-27d8c87aedd51300509acc020229a5f4a69a02065021004ff67dcc4fe69b6c5a WatchSource:0}: Error finding container 27d8c87aedd51300509acc020229a5f4a69a02065021004ff67dcc4fe69b6c5a: Status 404 returned error can't find the container with id 27d8c87aedd51300509acc020229a5f4a69a02065021004ff67dcc4fe69b6c5a Mar 20 11:00:47 crc kubenswrapper[4748]: I0320 11:00:47.501529 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"507647d5-8633-4346-a9e0-4af3eb0e3e5f","Type":"ContainerStarted","Data":"27d8c87aedd51300509acc020229a5f4a69a02065021004ff67dcc4fe69b6c5a"} Mar 20 11:00:47 crc kubenswrapper[4748]: I0320 11:00:47.503013 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a","Type":"ContainerStarted","Data":"9abaabc7f7277f0fadb806b27f81a4a1aa2d3fd88d7a763482ae2f37b96e5856"} Mar 20 11:00:47 crc kubenswrapper[4748]: I0320 11:00:47.569102 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a9b3e3-3a44-4765-ab5b-0e7955b524f7" path="/var/lib/kubelet/pods/a5a9b3e3-3a44-4765-ab5b-0e7955b524f7/volumes" Mar 20 11:00:47 crc kubenswrapper[4748]: I0320 11:00:47.570183 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9362889-0195-4aad-96bd-ed63db88da83" 
path="/var/lib/kubelet/pods/c9362889-0195-4aad-96bd-ed63db88da83/volumes" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.426432 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-7zk86"] Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.429481 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.432173 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.491401 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-7zk86"] Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.514598 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"507647d5-8633-4346-a9e0-4af3eb0e3e5f","Type":"ContainerStarted","Data":"0c7552290b90407d9f27c2c2ba3622fa03227521cfded9fde55cf5b4478d5622"} Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.517593 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a","Type":"ContainerStarted","Data":"427e1c298d95f58f79b7797e69454eb041aece5f7b89def3c34ecdb1dfeba329"} Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.578400 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.578481 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.578519 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.578550 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-config\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.578623 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.578650 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz878\" (UniqueName: \"kubernetes.io/projected/d7861e5c-351b-442a-8b86-3a7e9ea3e856-kube-api-access-fz878\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.578709 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.680318 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.680436 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.680490 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-config\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.680599 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.680623 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz878\" (UniqueName: 
\"kubernetes.io/projected/d7861e5c-351b-442a-8b86-3a7e9ea3e856-kube-api-access-fz878\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.680706 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.680755 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.681132 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.682132 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.683008 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.683502 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.684000 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.684163 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-config\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.706260 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz878\" (UniqueName: \"kubernetes.io/projected/d7861e5c-351b-442a-8b86-3a7e9ea3e856-kube-api-access-fz878\") pod \"dnsmasq-dns-79bd4cc8c9-7zk86\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:48 crc kubenswrapper[4748]: I0320 11:00:48.776639 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:49 crc kubenswrapper[4748]: I0320 11:00:49.265869 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-7zk86"] Mar 20 11:00:49 crc kubenswrapper[4748]: W0320 11:00:49.272975 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7861e5c_351b_442a_8b86_3a7e9ea3e856.slice/crio-0a78df30682ad8398c48dbdf5d315e3f256bd6c9ae01466f2d487a6fe3c45480 WatchSource:0}: Error finding container 0a78df30682ad8398c48dbdf5d315e3f256bd6c9ae01466f2d487a6fe3c45480: Status 404 returned error can't find the container with id 0a78df30682ad8398c48dbdf5d315e3f256bd6c9ae01466f2d487a6fe3c45480 Mar 20 11:00:49 crc kubenswrapper[4748]: I0320 11:00:49.537949 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" event={"ID":"d7861e5c-351b-442a-8b86-3a7e9ea3e856","Type":"ContainerStarted","Data":"2d64e82d02a533e7e90598eaa40c12a915bd9f4213f0e457ff0c5d27dd41a2d7"} Mar 20 11:00:49 crc kubenswrapper[4748]: I0320 11:00:49.537999 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" event={"ID":"d7861e5c-351b-442a-8b86-3a7e9ea3e856","Type":"ContainerStarted","Data":"0a78df30682ad8398c48dbdf5d315e3f256bd6c9ae01466f2d487a6fe3c45480"} Mar 20 11:00:50 crc kubenswrapper[4748]: I0320 11:00:50.549553 4748 generic.go:334] "Generic (PLEG): container finished" podID="d7861e5c-351b-442a-8b86-3a7e9ea3e856" containerID="2d64e82d02a533e7e90598eaa40c12a915bd9f4213f0e457ff0c5d27dd41a2d7" exitCode=0 Mar 20 11:00:50 crc kubenswrapper[4748]: I0320 11:00:50.549771 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" event={"ID":"d7861e5c-351b-442a-8b86-3a7e9ea3e856","Type":"ContainerDied","Data":"2d64e82d02a533e7e90598eaa40c12a915bd9f4213f0e457ff0c5d27dd41a2d7"} Mar 20 11:00:51 crc 
kubenswrapper[4748]: I0320 11:00:51.560149 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" event={"ID":"d7861e5c-351b-442a-8b86-3a7e9ea3e856","Type":"ContainerStarted","Data":"0aacf0224ee0c71f8ff0a63ee00055f127dd8a6a5bfce682a48d870520eb6c14"} Mar 20 11:00:51 crc kubenswrapper[4748]: I0320 11:00:51.561782 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:51 crc kubenswrapper[4748]: I0320 11:00:51.591748 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" podStartSLOduration=3.5917291479999998 podStartE2EDuration="3.591729148s" podCreationTimestamp="2026-03-20 11:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:51.584052336 +0000 UTC m=+1486.725598150" watchObservedRunningTime="2026-03-20 11:00:51.591729148 +0000 UTC m=+1486.733274962" Mar 20 11:00:58 crc kubenswrapper[4748]: I0320 11:00:58.779068 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:00:58 crc kubenswrapper[4748]: I0320 11:00:58.846419 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s9zg6"] Mar 20 11:00:58 crc kubenswrapper[4748]: I0320 11:00:58.846851 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" podUID="58e56aa6-3665-4020-827c-4b961f13924b" containerName="dnsmasq-dns" containerID="cri-o://92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373" gracePeriod=10 Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.012146 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-ckmg2"] Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.015001 4748 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.024327 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.024415 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-config\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.024457 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.024518 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.024577 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-dns-svc\") pod 
\"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.024749 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdb9g\" (UniqueName: \"kubernetes.io/projected/66fa6d88-f5fa-4288-8b2b-bc30561967c0-kube-api-access-mdb9g\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.024817 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.034360 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-ckmg2"] Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.126097 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.126184 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-config\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.126225 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.126302 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.126359 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.126433 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdb9g\" (UniqueName: \"kubernetes.io/projected/66fa6d88-f5fa-4288-8b2b-bc30561967c0-kube-api-access-mdb9g\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.126464 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.127118 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.127240 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.127811 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.128012 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.128580 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-config\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.130422 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/66fa6d88-f5fa-4288-8b2b-bc30561967c0-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.181472 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdb9g\" (UniqueName: \"kubernetes.io/projected/66fa6d88-f5fa-4288-8b2b-bc30561967c0-kube-api-access-mdb9g\") pod \"dnsmasq-dns-54ffdb7d8c-ckmg2\" (UID: \"66fa6d88-f5fa-4288-8b2b-bc30561967c0\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.350961 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.503116 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.639850 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-sb\") pod \"58e56aa6-3665-4020-827c-4b961f13924b\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.639909 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-nb\") pod \"58e56aa6-3665-4020-827c-4b961f13924b\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.639956 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-config\") pod \"58e56aa6-3665-4020-827c-4b961f13924b\" (UID: 
\"58e56aa6-3665-4020-827c-4b961f13924b\") " Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.640007 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-swift-storage-0\") pod \"58e56aa6-3665-4020-827c-4b961f13924b\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.640123 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n96fp\" (UniqueName: \"kubernetes.io/projected/58e56aa6-3665-4020-827c-4b961f13924b-kube-api-access-n96fp\") pod \"58e56aa6-3665-4020-827c-4b961f13924b\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.640216 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-svc\") pod \"58e56aa6-3665-4020-827c-4b961f13924b\" (UID: \"58e56aa6-3665-4020-827c-4b961f13924b\") " Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.655578 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e56aa6-3665-4020-827c-4b961f13924b-kube-api-access-n96fp" (OuterVolumeSpecName: "kube-api-access-n96fp") pod "58e56aa6-3665-4020-827c-4b961f13924b" (UID: "58e56aa6-3665-4020-827c-4b961f13924b"). InnerVolumeSpecName "kube-api-access-n96fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.662255 4748 generic.go:334] "Generic (PLEG): container finished" podID="58e56aa6-3665-4020-827c-4b961f13924b" containerID="92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373" exitCode=0 Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.662299 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" event={"ID":"58e56aa6-3665-4020-827c-4b961f13924b","Type":"ContainerDied","Data":"92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373"} Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.662332 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" event={"ID":"58e56aa6-3665-4020-827c-4b961f13924b","Type":"ContainerDied","Data":"2edad229b6a897d80f6d86fd6f68e37a1aac8156c95f134e75704724811388b0"} Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.662351 4748 scope.go:117] "RemoveContainer" containerID="92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.662527 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-s9zg6" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.695769 4748 scope.go:117] "RemoveContainer" containerID="4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.743929 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n96fp\" (UniqueName: \"kubernetes.io/projected/58e56aa6-3665-4020-827c-4b961f13924b-kube-api-access-n96fp\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.770106 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-config" (OuterVolumeSpecName: "config") pod "58e56aa6-3665-4020-827c-4b961f13924b" (UID: "58e56aa6-3665-4020-827c-4b961f13924b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.805511 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "58e56aa6-3665-4020-827c-4b961f13924b" (UID: "58e56aa6-3665-4020-827c-4b961f13924b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.819581 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "58e56aa6-3665-4020-827c-4b961f13924b" (UID: "58e56aa6-3665-4020-827c-4b961f13924b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.836538 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "58e56aa6-3665-4020-827c-4b961f13924b" (UID: "58e56aa6-3665-4020-827c-4b961f13924b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.847024 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.847155 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.847171 4748 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.847183 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.887730 4748 scope.go:117] "RemoveContainer" containerID="92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.888387 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"58e56aa6-3665-4020-827c-4b961f13924b" (UID: "58e56aa6-3665-4020-827c-4b961f13924b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:59 crc kubenswrapper[4748]: E0320 11:00:59.889661 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373\": container with ID starting with 92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373 not found: ID does not exist" containerID="92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.889709 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373"} err="failed to get container status \"92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373\": rpc error: code = NotFound desc = could not find container \"92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373\": container with ID starting with 92e782d1a7e09d2cc3eb10cc8bbd162df5212bb983fd019e2e5780d3df198373 not found: ID does not exist" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.889744 4748 scope.go:117] "RemoveContainer" containerID="4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920" Mar 20 11:00:59 crc kubenswrapper[4748]: E0320 11:00:59.890729 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920\": container with ID starting with 4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920 not found: ID does not exist" containerID="4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.890777 4748 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920"} err="failed to get container status \"4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920\": rpc error: code = NotFound desc = could not find container \"4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920\": container with ID starting with 4e1e7aa1c41e26606a5631a17ee256676f245b65264b83c015acba498ebfe920 not found: ID does not exist" Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.903222 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-ckmg2"] Mar 20 11:00:59 crc kubenswrapper[4748]: I0320 11:00:59.949493 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58e56aa6-3665-4020-827c-4b961f13924b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.146712 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29566741-sxfvz"] Mar 20 11:01:00 crc kubenswrapper[4748]: E0320 11:01:00.148237 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e56aa6-3665-4020-827c-4b961f13924b" containerName="dnsmasq-dns" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.148263 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e56aa6-3665-4020-827c-4b961f13924b" containerName="dnsmasq-dns" Mar 20 11:01:00 crc kubenswrapper[4748]: E0320 11:01:00.148285 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e56aa6-3665-4020-827c-4b961f13924b" containerName="init" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.148294 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e56aa6-3665-4020-827c-4b961f13924b" containerName="init" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.153821 4748 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="58e56aa6-3665-4020-827c-4b961f13924b" containerName="dnsmasq-dns" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.155466 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.219118 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s9zg6"] Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.231634 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566741-sxfvz"] Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.246405 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-s9zg6"] Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.267392 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-combined-ca-bundle\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.268269 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb499\" (UniqueName: \"kubernetes.io/projected/7997caf5-1478-40d5-a0c6-6811d242ef17-kube-api-access-zb499\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.268756 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-fernet-keys\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 
crc kubenswrapper[4748]: I0320 11:01:00.269226 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-config-data\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.370610 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-fernet-keys\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.371178 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-config-data\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.371583 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-combined-ca-bundle\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.371621 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb499\" (UniqueName: \"kubernetes.io/projected/7997caf5-1478-40d5-a0c6-6811d242ef17-kube-api-access-zb499\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 
11:01:00.376932 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-combined-ca-bundle\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.377367 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-config-data\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.377698 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-fernet-keys\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.392797 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb499\" (UniqueName: \"kubernetes.io/projected/7997caf5-1478-40d5-a0c6-6811d242ef17-kube-api-access-zb499\") pod \"keystone-cron-29566741-sxfvz\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.511100 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.685494 4748 generic.go:334] "Generic (PLEG): container finished" podID="66fa6d88-f5fa-4288-8b2b-bc30561967c0" containerID="714962eb26731611d996d8715c0622e9d3f0a9f7c03c5207355956da8cf47920" exitCode=0 Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.685665 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" event={"ID":"66fa6d88-f5fa-4288-8b2b-bc30561967c0","Type":"ContainerDied","Data":"714962eb26731611d996d8715c0622e9d3f0a9f7c03c5207355956da8cf47920"} Mar 20 11:01:00 crc kubenswrapper[4748]: I0320 11:01:00.685719 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" event={"ID":"66fa6d88-f5fa-4288-8b2b-bc30561967c0","Type":"ContainerStarted","Data":"09f38a0d8643a82e15f72e46a8535c04e4cc259f649202c88688f07a9e73f552"} Mar 20 11:01:01 crc kubenswrapper[4748]: I0320 11:01:01.059665 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566741-sxfvz"] Mar 20 11:01:01 crc kubenswrapper[4748]: W0320 11:01:01.090731 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice/crio-3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18 WatchSource:0}: Error finding container 3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18: Status 404 returned error can't find the container with id 3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18 Mar 20 11:01:01 crc kubenswrapper[4748]: I0320 11:01:01.528061 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e56aa6-3665-4020-827c-4b961f13924b" path="/var/lib/kubelet/pods/58e56aa6-3665-4020-827c-4b961f13924b/volumes" Mar 20 11:01:01 crc kubenswrapper[4748]: I0320 11:01:01.714357 4748 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" event={"ID":"66fa6d88-f5fa-4288-8b2b-bc30561967c0","Type":"ContainerStarted","Data":"c0a7bc814176950f418814cca9fa1911604fad36021ff54c23c9ed25718a2fa6"} Mar 20 11:01:01 crc kubenswrapper[4748]: I0320 11:01:01.716189 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:01:01 crc kubenswrapper[4748]: I0320 11:01:01.717512 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566741-sxfvz" event={"ID":"7997caf5-1478-40d5-a0c6-6811d242ef17","Type":"ContainerStarted","Data":"dd45a8fd4cbdbfedfda44051e62e40b511f0a3d31e5b926636fe8e134fc2312a"} Mar 20 11:01:01 crc kubenswrapper[4748]: I0320 11:01:01.717540 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566741-sxfvz" event={"ID":"7997caf5-1478-40d5-a0c6-6811d242ef17","Type":"ContainerStarted","Data":"3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18"} Mar 20 11:01:01 crc kubenswrapper[4748]: I0320 11:01:01.746181 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" podStartSLOduration=3.746153157 podStartE2EDuration="3.746153157s" podCreationTimestamp="2026-03-20 11:00:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:01:01.731983333 +0000 UTC m=+1496.873529167" watchObservedRunningTime="2026-03-20 11:01:01.746153157 +0000 UTC m=+1496.887698961" Mar 20 11:01:01 crc kubenswrapper[4748]: I0320 11:01:01.754706 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29566741-sxfvz" podStartSLOduration=1.7546823 podStartE2EDuration="1.7546823s" podCreationTimestamp="2026-03-20 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-20 11:01:01.750040895 +0000 UTC m=+1496.891586709" watchObservedRunningTime="2026-03-20 11:01:01.7546823 +0000 UTC m=+1496.896228114" Mar 20 11:01:03 crc kubenswrapper[4748]: I0320 11:01:03.747908 4748 generic.go:334] "Generic (PLEG): container finished" podID="7997caf5-1478-40d5-a0c6-6811d242ef17" containerID="dd45a8fd4cbdbfedfda44051e62e40b511f0a3d31e5b926636fe8e134fc2312a" exitCode=0 Mar 20 11:01:03 crc kubenswrapper[4748]: I0320 11:01:03.748032 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566741-sxfvz" event={"ID":"7997caf5-1478-40d5-a0c6-6811d242ef17","Type":"ContainerDied","Data":"dd45a8fd4cbdbfedfda44051e62e40b511f0a3d31e5b926636fe8e134fc2312a"} Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.127369 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.284003 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-config-data\") pod \"7997caf5-1478-40d5-a0c6-6811d242ef17\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.284153 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-combined-ca-bundle\") pod \"7997caf5-1478-40d5-a0c6-6811d242ef17\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.284185 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-fernet-keys\") pod \"7997caf5-1478-40d5-a0c6-6811d242ef17\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " Mar 20 11:01:05 
crc kubenswrapper[4748]: I0320 11:01:05.284245 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb499\" (UniqueName: \"kubernetes.io/projected/7997caf5-1478-40d5-a0c6-6811d242ef17-kube-api-access-zb499\") pod \"7997caf5-1478-40d5-a0c6-6811d242ef17\" (UID: \"7997caf5-1478-40d5-a0c6-6811d242ef17\") " Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.292607 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7997caf5-1478-40d5-a0c6-6811d242ef17-kube-api-access-zb499" (OuterVolumeSpecName: "kube-api-access-zb499") pod "7997caf5-1478-40d5-a0c6-6811d242ef17" (UID: "7997caf5-1478-40d5-a0c6-6811d242ef17"). InnerVolumeSpecName "kube-api-access-zb499". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.292978 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7997caf5-1478-40d5-a0c6-6811d242ef17" (UID: "7997caf5-1478-40d5-a0c6-6811d242ef17"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.327136 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7997caf5-1478-40d5-a0c6-6811d242ef17" (UID: "7997caf5-1478-40d5-a0c6-6811d242ef17"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.356045 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-config-data" (OuterVolumeSpecName: "config-data") pod "7997caf5-1478-40d5-a0c6-6811d242ef17" (UID: "7997caf5-1478-40d5-a0c6-6811d242ef17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.386856 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.386894 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.386906 4748 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7997caf5-1478-40d5-a0c6-6811d242ef17-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.386918 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb499\" (UniqueName: \"kubernetes.io/projected/7997caf5-1478-40d5-a0c6-6811d242ef17-kube-api-access-zb499\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.766991 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566741-sxfvz" event={"ID":"7997caf5-1478-40d5-a0c6-6811d242ef17","Type":"ContainerDied","Data":"3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18"} Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.767271 4748 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18" Mar 20 11:01:05 crc kubenswrapper[4748]: I0320 11:01:05.767058 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566741-sxfvz" Mar 20 11:01:09 crc kubenswrapper[4748]: I0320 11:01:09.352924 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54ffdb7d8c-ckmg2" Mar 20 11:01:09 crc kubenswrapper[4748]: I0320 11:01:09.441994 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-7zk86"] Mar 20 11:01:09 crc kubenswrapper[4748]: I0320 11:01:09.442265 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" podUID="d7861e5c-351b-442a-8b86-3a7e9ea3e856" containerName="dnsmasq-dns" containerID="cri-o://0aacf0224ee0c71f8ff0a63ee00055f127dd8a6a5bfce682a48d870520eb6c14" gracePeriod=10 Mar 20 11:01:09 crc kubenswrapper[4748]: I0320 11:01:09.819366 4748 generic.go:334] "Generic (PLEG): container finished" podID="d7861e5c-351b-442a-8b86-3a7e9ea3e856" containerID="0aacf0224ee0c71f8ff0a63ee00055f127dd8a6a5bfce682a48d870520eb6c14" exitCode=0 Mar 20 11:01:09 crc kubenswrapper[4748]: I0320 11:01:09.819431 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" event={"ID":"d7861e5c-351b-442a-8b86-3a7e9ea3e856","Type":"ContainerDied","Data":"0aacf0224ee0c71f8ff0a63ee00055f127dd8a6a5bfce682a48d870520eb6c14"} Mar 20 11:01:09 crc kubenswrapper[4748]: I0320 11:01:09.996858 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.183955 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-swift-storage-0\") pod \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.184087 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-nb\") pod \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.184122 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz878\" (UniqueName: \"kubernetes.io/projected/d7861e5c-351b-442a-8b86-3a7e9ea3e856-kube-api-access-fz878\") pod \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.184179 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-sb\") pod \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.184247 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-openstack-edpm-ipam\") pod \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.184273 4748 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-config\") pod \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.184308 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-svc\") pod \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\" (UID: \"d7861e5c-351b-442a-8b86-3a7e9ea3e856\") " Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.207105 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7861e5c-351b-442a-8b86-3a7e9ea3e856-kube-api-access-fz878" (OuterVolumeSpecName: "kube-api-access-fz878") pod "d7861e5c-351b-442a-8b86-3a7e9ea3e856" (UID: "d7861e5c-351b-442a-8b86-3a7e9ea3e856"). InnerVolumeSpecName "kube-api-access-fz878". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.280931 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-config" (OuterVolumeSpecName: "config") pod "d7861e5c-351b-442a-8b86-3a7e9ea3e856" (UID: "d7861e5c-351b-442a-8b86-3a7e9ea3e856"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.283657 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7861e5c-351b-442a-8b86-3a7e9ea3e856" (UID: "d7861e5c-351b-442a-8b86-3a7e9ea3e856"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.283742 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7861e5c-351b-442a-8b86-3a7e9ea3e856" (UID: "d7861e5c-351b-442a-8b86-3a7e9ea3e856"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.285149 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7861e5c-351b-442a-8b86-3a7e9ea3e856" (UID: "d7861e5c-351b-442a-8b86-3a7e9ea3e856"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.286653 4748 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.286685 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.286697 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz878\" (UniqueName: \"kubernetes.io/projected/d7861e5c-351b-442a-8b86-3a7e9ea3e856-kube-api-access-fz878\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.286709 4748 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.286722 4748 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.312054 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7861e5c-351b-442a-8b86-3a7e9ea3e856" (UID: "d7861e5c-351b-442a-8b86-3a7e9ea3e856"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.315250 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "d7861e5c-351b-442a-8b86-3a7e9ea3e856" (UID: "d7861e5c-351b-442a-8b86-3a7e9ea3e856"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.388425 4748 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.389336 4748 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7861e5c-351b-442a-8b86-3a7e9ea3e856-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.829803 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" event={"ID":"d7861e5c-351b-442a-8b86-3a7e9ea3e856","Type":"ContainerDied","Data":"0a78df30682ad8398c48dbdf5d315e3f256bd6c9ae01466f2d487a6fe3c45480"} Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.830115 4748 scope.go:117] "RemoveContainer" containerID="0aacf0224ee0c71f8ff0a63ee00055f127dd8a6a5bfce682a48d870520eb6c14" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.829890 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-7zk86" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.881867 4748 scope.go:117] "RemoveContainer" containerID="2d64e82d02a533e7e90598eaa40c12a915bd9f4213f0e457ff0c5d27dd41a2d7" Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.886902 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-7zk86"] Mar 20 11:01:10 crc kubenswrapper[4748]: I0320 11:01:10.901383 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-7zk86"] Mar 20 11:01:11 crc kubenswrapper[4748]: I0320 11:01:11.524983 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7861e5c-351b-442a-8b86-3a7e9ea3e856" path="/var/lib/kubelet/pods/d7861e5c-351b-442a-8b86-3a7e9ea3e856/volumes" Mar 20 11:01:11 crc kubenswrapper[4748]: E0320 11:01:11.972917 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice/crio-3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18\": RecentStats: unable to find data in memory cache]" Mar 20 11:01:12 crc kubenswrapper[4748]: I0320 11:01:12.928755 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:01:12 crc kubenswrapper[4748]: I0320 11:01:12.929131 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:01:20 crc kubenswrapper[4748]: I0320 11:01:20.933637 4748 generic.go:334] "Generic (PLEG): container finished" podID="507647d5-8633-4346-a9e0-4af3eb0e3e5f" containerID="0c7552290b90407d9f27c2c2ba3622fa03227521cfded9fde55cf5b4478d5622" exitCode=0 Mar 20 11:01:20 crc kubenswrapper[4748]: I0320 11:01:20.933737 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"507647d5-8633-4346-a9e0-4af3eb0e3e5f","Type":"ContainerDied","Data":"0c7552290b90407d9f27c2c2ba3622fa03227521cfded9fde55cf5b4478d5622"} Mar 20 11:01:20 crc kubenswrapper[4748]: I0320 11:01:20.936739 4748 generic.go:334] "Generic (PLEG): container finished" podID="b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a" containerID="427e1c298d95f58f79b7797e69454eb041aece5f7b89def3c34ecdb1dfeba329" exitCode=0 Mar 20 11:01:20 crc kubenswrapper[4748]: I0320 11:01:20.936774 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a","Type":"ContainerDied","Data":"427e1c298d95f58f79b7797e69454eb041aece5f7b89def3c34ecdb1dfeba329"} Mar 20 11:01:21 crc kubenswrapper[4748]: I0320 11:01:21.947880 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"507647d5-8633-4346-a9e0-4af3eb0e3e5f","Type":"ContainerStarted","Data":"3452163ea0e56bb35e7ff31cad987264b2bb55ca1440268b9830c47b851ce761"} Mar 20 11:01:21 crc kubenswrapper[4748]: I0320 11:01:21.948429 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:01:21 crc kubenswrapper[4748]: I0320 11:01:21.950062 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a","Type":"ContainerStarted","Data":"2a88185224c5c8e98695db9378c0f0cc463909897747a9931152333d90215fd8"} Mar 20 11:01:21 crc kubenswrapper[4748]: I0320 11:01:21.950311 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 11:01:21 crc kubenswrapper[4748]: I0320 11:01:21.980351 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.980325731 podStartE2EDuration="36.980325731s" podCreationTimestamp="2026-03-20 11:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:01:21.967561552 +0000 UTC m=+1517.109107366" watchObservedRunningTime="2026-03-20 11:01:21.980325731 +0000 UTC m=+1517.121871545" Mar 20 11:01:22 crc kubenswrapper[4748]: I0320 11:01:22.004247 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.004226439 podStartE2EDuration="37.004226439s" podCreationTimestamp="2026-03-20 11:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:01:21.998480215 +0000 UTC m=+1517.140026039" watchObservedRunningTime="2026-03-20 11:01:22.004226439 +0000 UTC m=+1517.145772253" Mar 20 11:01:22 crc kubenswrapper[4748]: E0320 11:01:22.218857 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice/crio-3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18\": RecentStats: unable to find 
data in memory cache]" Mar 20 11:01:32 crc kubenswrapper[4748]: E0320 11:01:32.445234 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice/crio-3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.593809 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6"] Mar 20 11:01:33 crc kubenswrapper[4748]: E0320 11:01:33.594608 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7861e5c-351b-442a-8b86-3a7e9ea3e856" containerName="dnsmasq-dns" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.594621 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7861e5c-351b-442a-8b86-3a7e9ea3e856" containerName="dnsmasq-dns" Mar 20 11:01:33 crc kubenswrapper[4748]: E0320 11:01:33.594646 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7997caf5-1478-40d5-a0c6-6811d242ef17" containerName="keystone-cron" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.594653 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7997caf5-1478-40d5-a0c6-6811d242ef17" containerName="keystone-cron" Mar 20 11:01:33 crc kubenswrapper[4748]: E0320 11:01:33.594676 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7861e5c-351b-442a-8b86-3a7e9ea3e856" containerName="init" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.594682 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7861e5c-351b-442a-8b86-3a7e9ea3e856" containerName="init" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 
11:01:33.594867 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7861e5c-351b-442a-8b86-3a7e9ea3e856" containerName="dnsmasq-dns" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.594889 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="7997caf5-1478-40d5-a0c6-6811d242ef17" containerName="keystone-cron" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.595492 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.598427 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.598681 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.599693 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.601046 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.611550 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6"] Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.669619 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl4tr\" (UniqueName: \"kubernetes.io/projected/6298ed1e-1de4-489a-ba4c-ca6f3f989909-kube-api-access-bl4tr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 
11:01:33.669685 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.669721 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.669860 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.771445 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl4tr\" (UniqueName: \"kubernetes.io/projected/6298ed1e-1de4-489a-ba4c-ca6f3f989909-kube-api-access-bl4tr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.771501 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.771529 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.771605 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.781717 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.781812 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.782852 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.798424 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl4tr\" (UniqueName: \"kubernetes.io/projected/6298ed1e-1de4-489a-ba4c-ca6f3f989909-kube-api-access-bl4tr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:33 crc kubenswrapper[4748]: I0320 11:01:33.916925 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:01:34 crc kubenswrapper[4748]: I0320 11:01:34.443942 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6"] Mar 20 11:01:35 crc kubenswrapper[4748]: I0320 11:01:35.078974 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" event={"ID":"6298ed1e-1de4-489a-ba4c-ca6f3f989909","Type":"ContainerStarted","Data":"ee297add583c1724db1957dc6098b04ce4c62abdb7a546bb80c51b2e4ba74088"} Mar 20 11:01:36 crc kubenswrapper[4748]: I0320 11:01:36.042753 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 11:01:36 crc kubenswrapper[4748]: I0320 11:01:36.161442 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 11:01:42 crc kubenswrapper[4748]: E0320 11:01:42.697464 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice/crio-3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18\": RecentStats: unable to find data in memory cache]" Mar 20 11:01:42 crc kubenswrapper[4748]: I0320 11:01:42.928560 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:01:42 crc kubenswrapper[4748]: I0320 11:01:42.928663 4748 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.510808 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j4djx"] Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.513740 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.530187 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-786lm\" (UniqueName: \"kubernetes.io/projected/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-kube-api-access-786lm\") pod \"redhat-marketplace-j4djx\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.530291 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-catalog-content\") pod \"redhat-marketplace-j4djx\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.530414 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-utilities\") pod \"redhat-marketplace-j4djx\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.551798 4748 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4djx"] Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.632804 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-catalog-content\") pod \"redhat-marketplace-j4djx\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.633065 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-utilities\") pod \"redhat-marketplace-j4djx\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.633195 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-786lm\" (UniqueName: \"kubernetes.io/projected/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-kube-api-access-786lm\") pod \"redhat-marketplace-j4djx\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.633374 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-catalog-content\") pod \"redhat-marketplace-j4djx\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.633704 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-utilities\") pod \"redhat-marketplace-j4djx\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " 
pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.676198 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-786lm\" (UniqueName: \"kubernetes.io/projected/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-kube-api-access-786lm\") pod \"redhat-marketplace-j4djx\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:45 crc kubenswrapper[4748]: I0320 11:01:45.841468 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:01:49 crc kubenswrapper[4748]: I0320 11:01:49.034373 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4djx"] Mar 20 11:01:49 crc kubenswrapper[4748]: I0320 11:01:49.326177 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" event={"ID":"6298ed1e-1de4-489a-ba4c-ca6f3f989909","Type":"ContainerStarted","Data":"9935fa68ed4b543f4f0becf5c3be91db0ee85dbcffa9892d78fb821d80a0b82d"} Mar 20 11:01:49 crc kubenswrapper[4748]: I0320 11:01:49.328556 4748 generic.go:334] "Generic (PLEG): container finished" podID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerID="f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c" exitCode=0 Mar 20 11:01:49 crc kubenswrapper[4748]: I0320 11:01:49.328591 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4djx" event={"ID":"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6","Type":"ContainerDied","Data":"f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c"} Mar 20 11:01:49 crc kubenswrapper[4748]: I0320 11:01:49.328613 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4djx" 
event={"ID":"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6","Type":"ContainerStarted","Data":"6e68bce54ded2a6a7800eaa31e30c1709e8057cd77dfc9ec611ebf5998943e29"} Mar 20 11:01:49 crc kubenswrapper[4748]: I0320 11:01:49.348160 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" podStartSLOduration=2.148948977 podStartE2EDuration="16.348140394s" podCreationTimestamp="2026-03-20 11:01:33 +0000 UTC" firstStartedPulling="2026-03-20 11:01:34.449428808 +0000 UTC m=+1529.590974622" lastFinishedPulling="2026-03-20 11:01:48.648620225 +0000 UTC m=+1543.790166039" observedRunningTime="2026-03-20 11:01:49.33997866 +0000 UTC m=+1544.481524484" watchObservedRunningTime="2026-03-20 11:01:49.348140394 +0000 UTC m=+1544.489686198" Mar 20 11:01:51 crc kubenswrapper[4748]: I0320 11:01:51.352779 4748 generic.go:334] "Generic (PLEG): container finished" podID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerID="6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c" exitCode=0 Mar 20 11:01:51 crc kubenswrapper[4748]: I0320 11:01:51.352878 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4djx" event={"ID":"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6","Type":"ContainerDied","Data":"6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c"} Mar 20 11:01:52 crc kubenswrapper[4748]: E0320 11:01:52.928124 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice/crio-3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18\": RecentStats: unable to find data in memory cache]" Mar 20 11:01:56 crc kubenswrapper[4748]: I0320 
11:01:56.398374 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4djx" event={"ID":"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6","Type":"ContainerStarted","Data":"fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c"} Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.138687 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j4djx" podStartSLOduration=8.665669716 podStartE2EDuration="15.138664418s" podCreationTimestamp="2026-03-20 11:01:45 +0000 UTC" firstStartedPulling="2026-03-20 11:01:49.329712013 +0000 UTC m=+1544.471257827" lastFinishedPulling="2026-03-20 11:01:55.802706725 +0000 UTC m=+1550.944252529" observedRunningTime="2026-03-20 11:01:56.429340183 +0000 UTC m=+1551.570885997" watchObservedRunningTime="2026-03-20 11:02:00.138664418 +0000 UTC m=+1555.280210232" Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.150621 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566742-g82x8"] Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.152237 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-g82x8" Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.160227 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.160969 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.161097 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.181032 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-g82x8"] Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.324793 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq9xn\" (UniqueName: \"kubernetes.io/projected/97916155-2786-4b69-afc3-86954c6b4b50-kube-api-access-zq9xn\") pod \"auto-csr-approver-29566742-g82x8\" (UID: \"97916155-2786-4b69-afc3-86954c6b4b50\") " pod="openshift-infra/auto-csr-approver-29566742-g82x8" Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.426855 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq9xn\" (UniqueName: \"kubernetes.io/projected/97916155-2786-4b69-afc3-86954c6b4b50-kube-api-access-zq9xn\") pod \"auto-csr-approver-29566742-g82x8\" (UID: \"97916155-2786-4b69-afc3-86954c6b4b50\") " pod="openshift-infra/auto-csr-approver-29566742-g82x8" Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.445619 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq9xn\" (UniqueName: \"kubernetes.io/projected/97916155-2786-4b69-afc3-86954c6b4b50-kube-api-access-zq9xn\") pod \"auto-csr-approver-29566742-g82x8\" (UID: \"97916155-2786-4b69-afc3-86954c6b4b50\") " 
pod="openshift-infra/auto-csr-approver-29566742-g82x8" Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.475957 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-g82x8" Mar 20 11:02:00 crc kubenswrapper[4748]: I0320 11:02:00.944218 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-g82x8"] Mar 20 11:02:00 crc kubenswrapper[4748]: W0320 11:02:00.948634 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97916155_2786_4b69_afc3_86954c6b4b50.slice/crio-0b97e428364f28f5f0736199ddb24c297d92a4b350dee34b36aff1a9dccf9625 WatchSource:0}: Error finding container 0b97e428364f28f5f0736199ddb24c297d92a4b350dee34b36aff1a9dccf9625: Status 404 returned error can't find the container with id 0b97e428364f28f5f0736199ddb24c297d92a4b350dee34b36aff1a9dccf9625 Mar 20 11:02:01 crc kubenswrapper[4748]: I0320 11:02:01.445274 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-g82x8" event={"ID":"97916155-2786-4b69-afc3-86954c6b4b50","Type":"ContainerStarted","Data":"0b97e428364f28f5f0736199ddb24c297d92a4b350dee34b36aff1a9dccf9625"} Mar 20 11:02:02 crc kubenswrapper[4748]: I0320 11:02:02.455691 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-g82x8" event={"ID":"97916155-2786-4b69-afc3-86954c6b4b50","Type":"ContainerStarted","Data":"f2a8ddd516cf1926db46fb199dfb1f31a737a26079dd893201121819223acf58"} Mar 20 11:02:02 crc kubenswrapper[4748]: I0320 11:02:02.459209 4748 generic.go:334] "Generic (PLEG): container finished" podID="6298ed1e-1de4-489a-ba4c-ca6f3f989909" containerID="9935fa68ed4b543f4f0becf5c3be91db0ee85dbcffa9892d78fb821d80a0b82d" exitCode=0 Mar 20 11:02:02 crc kubenswrapper[4748]: I0320 11:02:02.459256 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" event={"ID":"6298ed1e-1de4-489a-ba4c-ca6f3f989909","Type":"ContainerDied","Data":"9935fa68ed4b543f4f0becf5c3be91db0ee85dbcffa9892d78fb821d80a0b82d"} Mar 20 11:02:02 crc kubenswrapper[4748]: I0320 11:02:02.471006 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566742-g82x8" podStartSLOduration=1.450865017 podStartE2EDuration="2.470981406s" podCreationTimestamp="2026-03-20 11:02:00 +0000 UTC" firstStartedPulling="2026-03-20 11:02:00.951735185 +0000 UTC m=+1556.093280989" lastFinishedPulling="2026-03-20 11:02:01.971851554 +0000 UTC m=+1557.113397378" observedRunningTime="2026-03-20 11:02:02.467411947 +0000 UTC m=+1557.608957761" watchObservedRunningTime="2026-03-20 11:02:02.470981406 +0000 UTC m=+1557.612527220" Mar 20 11:02:03 crc kubenswrapper[4748]: E0320 11:02:03.163616 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7997caf5_1478_40d5_a0c6_6811d242ef17.slice/crio-3ce29b9d4c783c39bda5dc646d5800e7cffd2659c5cda847ab07079ef41fbd18\": RecentStats: unable to find data in memory cache]" Mar 20 11:02:03 crc kubenswrapper[4748]: I0320 11:02:03.470370 4748 generic.go:334] "Generic (PLEG): container finished" podID="97916155-2786-4b69-afc3-86954c6b4b50" containerID="f2a8ddd516cf1926db46fb199dfb1f31a737a26079dd893201121819223acf58" exitCode=0 Mar 20 11:02:03 crc kubenswrapper[4748]: I0320 11:02:03.470470 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-g82x8" 
event={"ID":"97916155-2786-4b69-afc3-86954c6b4b50","Type":"ContainerDied","Data":"f2a8ddd516cf1926db46fb199dfb1f31a737a26079dd893201121819223acf58"} Mar 20 11:02:03 crc kubenswrapper[4748]: I0320 11:02:03.882747 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:02:03 crc kubenswrapper[4748]: I0320 11:02:03.999112 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-inventory\") pod \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " Mar 20 11:02:03 crc kubenswrapper[4748]: I0320 11:02:03.999277 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-repo-setup-combined-ca-bundle\") pod \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " Mar 20 11:02:03 crc kubenswrapper[4748]: I0320 11:02:03.999322 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl4tr\" (UniqueName: \"kubernetes.io/projected/6298ed1e-1de4-489a-ba4c-ca6f3f989909-kube-api-access-bl4tr\") pod \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " Mar 20 11:02:03 crc kubenswrapper[4748]: I0320 11:02:03.999372 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-ssh-key-openstack-edpm-ipam\") pod \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\" (UID: \"6298ed1e-1de4-489a-ba4c-ca6f3f989909\") " Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.006720 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6298ed1e-1de4-489a-ba4c-ca6f3f989909" (UID: "6298ed1e-1de4-489a-ba4c-ca6f3f989909"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.007743 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6298ed1e-1de4-489a-ba4c-ca6f3f989909-kube-api-access-bl4tr" (OuterVolumeSpecName: "kube-api-access-bl4tr") pod "6298ed1e-1de4-489a-ba4c-ca6f3f989909" (UID: "6298ed1e-1de4-489a-ba4c-ca6f3f989909"). InnerVolumeSpecName "kube-api-access-bl4tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.030762 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-inventory" (OuterVolumeSpecName: "inventory") pod "6298ed1e-1de4-489a-ba4c-ca6f3f989909" (UID: "6298ed1e-1de4-489a-ba4c-ca6f3f989909"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.033110 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6298ed1e-1de4-489a-ba4c-ca6f3f989909" (UID: "6298ed1e-1de4-489a-ba4c-ca6f3f989909"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.101733 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.101778 4748 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.101791 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl4tr\" (UniqueName: \"kubernetes.io/projected/6298ed1e-1de4-489a-ba4c-ca6f3f989909-kube-api-access-bl4tr\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.101804 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6298ed1e-1de4-489a-ba4c-ca6f3f989909-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.480704 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" event={"ID":"6298ed1e-1de4-489a-ba4c-ca6f3f989909","Type":"ContainerDied","Data":"ee297add583c1724db1957dc6098b04ce4c62abdb7a546bb80c51b2e4ba74088"} Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.480761 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee297add583c1724db1957dc6098b04ce4c62abdb7a546bb80c51b2e4ba74088" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.480727 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.566546 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn"] Mar 20 11:02:04 crc kubenswrapper[4748]: E0320 11:02:04.567029 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6298ed1e-1de4-489a-ba4c-ca6f3f989909" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.567052 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6298ed1e-1de4-489a-ba4c-ca6f3f989909" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.567332 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="6298ed1e-1de4-489a-ba4c-ca6f3f989909" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.568103 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.575202 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.575334 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.575440 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.575629 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.595587 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn"] Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.719393 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-s6rzn\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.719748 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-s6rzn\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.719803 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v98f\" (UniqueName: \"kubernetes.io/projected/c8abd498-c75d-47c5-992b-77857b856c30-kube-api-access-5v98f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-s6rzn\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.775788 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-g82x8" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.824647 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-s6rzn\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.824693 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-s6rzn\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.824740 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v98f\" (UniqueName: \"kubernetes.io/projected/c8abd498-c75d-47c5-992b-77857b856c30-kube-api-access-5v98f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-s6rzn\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.832507 4748 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-s6rzn\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.832591 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-s6rzn\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.846032 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v98f\" (UniqueName: \"kubernetes.io/projected/c8abd498-c75d-47c5-992b-77857b856c30-kube-api-access-5v98f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-s6rzn\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.891804 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.926098 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq9xn\" (UniqueName: \"kubernetes.io/projected/97916155-2786-4b69-afc3-86954c6b4b50-kube-api-access-zq9xn\") pod \"97916155-2786-4b69-afc3-86954c6b4b50\" (UID: \"97916155-2786-4b69-afc3-86954c6b4b50\") " Mar 20 11:02:04 crc kubenswrapper[4748]: I0320 11:02:04.932449 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97916155-2786-4b69-afc3-86954c6b4b50-kube-api-access-zq9xn" (OuterVolumeSpecName: "kube-api-access-zq9xn") pod "97916155-2786-4b69-afc3-86954c6b4b50" (UID: "97916155-2786-4b69-afc3-86954c6b4b50"). InnerVolumeSpecName "kube-api-access-zq9xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.028775 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq9xn\" (UniqueName: \"kubernetes.io/projected/97916155-2786-4b69-afc3-86954c6b4b50-kube-api-access-zq9xn\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4748]: W0320 11:02:05.409262 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8abd498_c75d_47c5_992b_77857b856c30.slice/crio-1aabd8278acf52fdf27f5221519ef6c119d8331b0b507ef7ffc9b2fb5aa5a751 WatchSource:0}: Error finding container 1aabd8278acf52fdf27f5221519ef6c119d8331b0b507ef7ffc9b2fb5aa5a751: Status 404 returned error can't find the container with id 1aabd8278acf52fdf27f5221519ef6c119d8331b0b507ef7ffc9b2fb5aa5a751 Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.410483 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn"] Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.492950 4748 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" event={"ID":"c8abd498-c75d-47c5-992b-77857b856c30","Type":"ContainerStarted","Data":"1aabd8278acf52fdf27f5221519ef6c119d8331b0b507ef7ffc9b2fb5aa5a751"} Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.494902 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-g82x8" event={"ID":"97916155-2786-4b69-afc3-86954c6b4b50","Type":"ContainerDied","Data":"0b97e428364f28f5f0736199ddb24c297d92a4b350dee34b36aff1a9dccf9625"} Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.494946 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b97e428364f28f5f0736199ddb24c297d92a4b350dee34b36aff1a9dccf9625" Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.494943 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-g82x8" Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.548963 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566736-zpttt"] Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.560599 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566736-zpttt"] Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.599174 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.849010 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.849073 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:02:05 crc kubenswrapper[4748]: I0320 11:02:05.904474 4748 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:02:06 crc kubenswrapper[4748]: I0320 11:02:06.505874 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" event={"ID":"c8abd498-c75d-47c5-992b-77857b856c30","Type":"ContainerStarted","Data":"41abad1438045219fd169ded2b012d676626b927eedf754e3e1c45a090717026"} Mar 20 11:02:06 crc kubenswrapper[4748]: I0320 11:02:06.521679 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" podStartSLOduration=2.337416577 podStartE2EDuration="2.521662161s" podCreationTimestamp="2026-03-20 11:02:04 +0000 UTC" firstStartedPulling="2026-03-20 11:02:05.412066615 +0000 UTC m=+1560.553612429" lastFinishedPulling="2026-03-20 11:02:05.596312199 +0000 UTC m=+1560.737858013" observedRunningTime="2026-03-20 11:02:06.520903252 +0000 UTC m=+1561.662449066" watchObservedRunningTime="2026-03-20 11:02:06.521662161 +0000 UTC m=+1561.663207975" Mar 20 11:02:06 crc kubenswrapper[4748]: I0320 11:02:06.559340 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:02:06 crc kubenswrapper[4748]: I0320 11:02:06.736734 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4djx"] Mar 20 11:02:07 crc kubenswrapper[4748]: I0320 11:02:07.529445 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3" path="/var/lib/kubelet/pods/1de2aa6f-ec78-4ec8-bc9d-f903756a3fd3/volumes" Mar 20 11:02:08 crc kubenswrapper[4748]: I0320 11:02:08.525641 4748 generic.go:334] "Generic (PLEG): container finished" podID="c8abd498-c75d-47c5-992b-77857b856c30" containerID="41abad1438045219fd169ded2b012d676626b927eedf754e3e1c45a090717026" exitCode=0 Mar 20 11:02:08 crc kubenswrapper[4748]: I0320 
11:02:08.525750 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" event={"ID":"c8abd498-c75d-47c5-992b-77857b856c30","Type":"ContainerDied","Data":"41abad1438045219fd169ded2b012d676626b927eedf754e3e1c45a090717026"} Mar 20 11:02:08 crc kubenswrapper[4748]: I0320 11:02:08.526246 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j4djx" podUID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerName="registry-server" containerID="cri-o://fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c" gracePeriod=2 Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.023687 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.119451 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-utilities\") pod \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.119549 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-catalog-content\") pod \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.119628 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-786lm\" (UniqueName: \"kubernetes.io/projected/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-kube-api-access-786lm\") pod \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\" (UID: \"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6\") " Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.120917 4748 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-utilities" (OuterVolumeSpecName: "utilities") pod "8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" (UID: "8f860d5a-e1f3-4500-871e-6e9fb5f8acd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.127067 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-kube-api-access-786lm" (OuterVolumeSpecName: "kube-api-access-786lm") pod "8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" (UID: "8f860d5a-e1f3-4500-871e-6e9fb5f8acd6"). InnerVolumeSpecName "kube-api-access-786lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.148612 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" (UID: "8f860d5a-e1f3-4500-871e-6e9fb5f8acd6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.221736 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.221782 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.221797 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-786lm\" (UniqueName: \"kubernetes.io/projected/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6-kube-api-access-786lm\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.537363 4748 generic.go:334] "Generic (PLEG): container finished" podID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerID="fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c" exitCode=0 Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.537475 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4djx" event={"ID":"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6","Type":"ContainerDied","Data":"fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c"} Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.537578 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4djx" event={"ID":"8f860d5a-e1f3-4500-871e-6e9fb5f8acd6","Type":"ContainerDied","Data":"6e68bce54ded2a6a7800eaa31e30c1709e8057cd77dfc9ec611ebf5998943e29"} Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.537419 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4djx" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.537638 4748 scope.go:117] "RemoveContainer" containerID="fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.574244 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4djx"] Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.583753 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4djx"] Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.589097 4748 scope.go:117] "RemoveContainer" containerID="6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.612212 4748 scope.go:117] "RemoveContainer" containerID="f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.663192 4748 scope.go:117] "RemoveContainer" containerID="fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c" Mar 20 11:02:09 crc kubenswrapper[4748]: E0320 11:02:09.663786 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c\": container with ID starting with fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c not found: ID does not exist" containerID="fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.663844 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c"} err="failed to get container status \"fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c\": rpc error: code = NotFound desc = could not find container 
\"fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c\": container with ID starting with fd346146db74d9746a9c49401fd8370941cbcecd520608fce60c559d832f242c not found: ID does not exist" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.663875 4748 scope.go:117] "RemoveContainer" containerID="6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c" Mar 20 11:02:09 crc kubenswrapper[4748]: E0320 11:02:09.664230 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c\": container with ID starting with 6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c not found: ID does not exist" containerID="6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.664270 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c"} err="failed to get container status \"6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c\": rpc error: code = NotFound desc = could not find container \"6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c\": container with ID starting with 6fd836298a433e44ab162c36a25210d10d532f15c6dd79229d2cebf91190e73c not found: ID does not exist" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.664302 4748 scope.go:117] "RemoveContainer" containerID="f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c" Mar 20 11:02:09 crc kubenswrapper[4748]: E0320 11:02:09.664705 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c\": container with ID starting with f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c not found: ID does not exist" 
containerID="f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.664739 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c"} err="failed to get container status \"f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c\": rpc error: code = NotFound desc = could not find container \"f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c\": container with ID starting with f64edf2bae4476b8c911f9259c8baf848da52bf51f4107c840041379ee79564c not found: ID does not exist" Mar 20 11:02:09 crc kubenswrapper[4748]: I0320 11:02:09.970619 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.035858 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v98f\" (UniqueName: \"kubernetes.io/projected/c8abd498-c75d-47c5-992b-77857b856c30-kube-api-access-5v98f\") pod \"c8abd498-c75d-47c5-992b-77857b856c30\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.035978 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-ssh-key-openstack-edpm-ipam\") pod \"c8abd498-c75d-47c5-992b-77857b856c30\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.036058 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-inventory\") pod \"c8abd498-c75d-47c5-992b-77857b856c30\" (UID: \"c8abd498-c75d-47c5-992b-77857b856c30\") " Mar 20 11:02:10 crc 
kubenswrapper[4748]: I0320 11:02:10.046941 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8abd498-c75d-47c5-992b-77857b856c30-kube-api-access-5v98f" (OuterVolumeSpecName: "kube-api-access-5v98f") pod "c8abd498-c75d-47c5-992b-77857b856c30" (UID: "c8abd498-c75d-47c5-992b-77857b856c30"). InnerVolumeSpecName "kube-api-access-5v98f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.063810 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-inventory" (OuterVolumeSpecName: "inventory") pod "c8abd498-c75d-47c5-992b-77857b856c30" (UID: "c8abd498-c75d-47c5-992b-77857b856c30"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.090033 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c8abd498-c75d-47c5-992b-77857b856c30" (UID: "c8abd498-c75d-47c5-992b-77857b856c30"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.139078 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v98f\" (UniqueName: \"kubernetes.io/projected/c8abd498-c75d-47c5-992b-77857b856c30-kube-api-access-5v98f\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.139114 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.139124 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8abd498-c75d-47c5-992b-77857b856c30-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.553406 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" event={"ID":"c8abd498-c75d-47c5-992b-77857b856c30","Type":"ContainerDied","Data":"1aabd8278acf52fdf27f5221519ef6c119d8331b0b507ef7ffc9b2fb5aa5a751"} Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.553699 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aabd8278acf52fdf27f5221519ef6c119d8331b0b507ef7ffc9b2fb5aa5a751" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.553472 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-s6rzn" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.647475 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5"] Mar 20 11:02:10 crc kubenswrapper[4748]: E0320 11:02:10.647876 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerName="extract-content" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.647894 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerName="extract-content" Mar 20 11:02:10 crc kubenswrapper[4748]: E0320 11:02:10.647919 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8abd498-c75d-47c5-992b-77857b856c30" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.647933 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8abd498-c75d-47c5-992b-77857b856c30" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 11:02:10 crc kubenswrapper[4748]: E0320 11:02:10.647947 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97916155-2786-4b69-afc3-86954c6b4b50" containerName="oc" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.647955 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="97916155-2786-4b69-afc3-86954c6b4b50" containerName="oc" Mar 20 11:02:10 crc kubenswrapper[4748]: E0320 11:02:10.647969 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerName="extract-utilities" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.647975 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerName="extract-utilities" Mar 20 11:02:10 crc kubenswrapper[4748]: E0320 11:02:10.647986 4748 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerName="registry-server" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.647992 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerName="registry-server" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.648164 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" containerName="registry-server" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.648180 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="97916155-2786-4b69-afc3-86954c6b4b50" containerName="oc" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.648191 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8abd498-c75d-47c5-992b-77857b856c30" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.648788 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.652707 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.652860 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.652950 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.653014 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.664702 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5"] Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.751334 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5qjm\" (UniqueName: \"kubernetes.io/projected/01e10255-e1d0-4e62-9b54-4c1043b5f502-kube-api-access-v5qjm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.751436 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 
11:02:10.751556 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.751588 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.853609 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.853750 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.853790 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.853868 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5qjm\" (UniqueName: \"kubernetes.io/projected/01e10255-e1d0-4e62-9b54-4c1043b5f502-kube-api-access-v5qjm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.858993 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.860085 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.861623 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.871129 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5qjm\" (UniqueName: \"kubernetes.io/projected/01e10255-e1d0-4e62-9b54-4c1043b5f502-kube-api-access-v5qjm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:10 crc kubenswrapper[4748]: I0320 11:02:10.967187 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:02:11 crc kubenswrapper[4748]: I0320 11:02:11.525099 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f860d5a-e1f3-4500-871e-6e9fb5f8acd6" path="/var/lib/kubelet/pods/8f860d5a-e1f3-4500-871e-6e9fb5f8acd6/volumes" Mar 20 11:02:11 crc kubenswrapper[4748]: I0320 11:02:11.543943 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5"] Mar 20 11:02:11 crc kubenswrapper[4748]: I0320 11:02:11.567648 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" event={"ID":"01e10255-e1d0-4e62-9b54-4c1043b5f502","Type":"ContainerStarted","Data":"7a91cb434075fb975306e67b3ddb13b4233768cea44f7632aba5b8e82531d530"} Mar 20 11:02:12 crc kubenswrapper[4748]: I0320 11:02:12.583549 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" event={"ID":"01e10255-e1d0-4e62-9b54-4c1043b5f502","Type":"ContainerStarted","Data":"5164b6bc15b2d4bbd0bb40e76f70af7ffc406cf66de263f26f18c83e5d50d2d9"} Mar 20 11:02:12 crc kubenswrapper[4748]: I0320 11:02:12.928754 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:02:12 crc kubenswrapper[4748]: I0320 11:02:12.928825 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:02:12 crc kubenswrapper[4748]: I0320 11:02:12.928945 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 11:02:12 crc kubenswrapper[4748]: I0320 11:02:12.930013 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec8318a59f2a0dcbdd19ff1535aafa2664120c1ab98ee7f0ce82eda8b7b3e371"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:02:12 crc kubenswrapper[4748]: I0320 11:02:12.930100 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://ec8318a59f2a0dcbdd19ff1535aafa2664120c1ab98ee7f0ce82eda8b7b3e371" gracePeriod=600 Mar 20 11:02:13 crc kubenswrapper[4748]: I0320 11:02:13.598714 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="ec8318a59f2a0dcbdd19ff1535aafa2664120c1ab98ee7f0ce82eda8b7b3e371" exitCode=0 Mar 20 11:02:13 crc kubenswrapper[4748]: I0320 11:02:13.598809 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"ec8318a59f2a0dcbdd19ff1535aafa2664120c1ab98ee7f0ce82eda8b7b3e371"} Mar 20 11:02:13 crc kubenswrapper[4748]: I0320 11:02:13.599165 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead"} Mar 20 11:02:13 crc kubenswrapper[4748]: I0320 11:02:13.599187 4748 scope.go:117] "RemoveContainer" containerID="5702c4811be941554197075836f07c222a1d80d3d9c15e6ccc7b992ee69ce82c" Mar 20 11:02:13 crc kubenswrapper[4748]: I0320 11:02:13.624457 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" podStartSLOduration=3.444703288 podStartE2EDuration="3.624435609s" podCreationTimestamp="2026-03-20 11:02:10 +0000 UTC" firstStartedPulling="2026-03-20 11:02:11.550534328 +0000 UTC m=+1566.692080142" lastFinishedPulling="2026-03-20 11:02:11.730266649 +0000 UTC m=+1566.871812463" observedRunningTime="2026-03-20 11:02:12.603390326 +0000 UTC m=+1567.744936140" watchObservedRunningTime="2026-03-20 11:02:13.624435609 +0000 UTC m=+1568.765981423" Mar 20 11:02:17 crc kubenswrapper[4748]: I0320 11:02:17.304799 4748 scope.go:117] "RemoveContainer" containerID="e0195e5b9b083710c317244426866370155836b4ed13cd67af2e6cb0295ece7b" Mar 20 11:02:17 crc kubenswrapper[4748]: I0320 11:02:17.333741 4748 scope.go:117] "RemoveContainer" containerID="7cb3666c6f49b0aba3c470df100fcae2dc72637ad9c5c723dc545bd54342e5c2" Mar 20 11:02:17 crc kubenswrapper[4748]: I0320 11:02:17.378489 4748 scope.go:117] "RemoveContainer" containerID="e587cab5b997726602fd7a4fea64c3dd529dc0c8b7bed010dfa12a6999b395b7" Mar 20 11:02:17 crc kubenswrapper[4748]: I0320 11:02:17.442966 4748 
scope.go:117] "RemoveContainer" containerID="be3695bbb6fdf796a499eeaa190fe1108f365b77b2d5a6962cc53640abb5b2de" Mar 20 11:02:17 crc kubenswrapper[4748]: I0320 11:02:17.495280 4748 scope.go:117] "RemoveContainer" containerID="f36200a33dc22e05dc3c7d05e0b92e8c66da61fab414e967919c12f075a3e662" Mar 20 11:02:22 crc kubenswrapper[4748]: I0320 11:02:22.952426 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q9rwk"] Mar 20 11:02:22 crc kubenswrapper[4748]: I0320 11:02:22.955240 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:22 crc kubenswrapper[4748]: I0320 11:02:22.973705 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q9rwk"] Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.060476 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-utilities\") pod \"certified-operators-q9rwk\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") " pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.060533 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-catalog-content\") pod \"certified-operators-q9rwk\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") " pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.060584 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht9f5\" (UniqueName: \"kubernetes.io/projected/bae39b2b-a087-457b-bd48-1ae752e081a4-kube-api-access-ht9f5\") pod \"certified-operators-q9rwk\" (UID: 
\"bae39b2b-a087-457b-bd48-1ae752e081a4\") " pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.162286 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-utilities\") pod \"certified-operators-q9rwk\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") " pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.162348 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-catalog-content\") pod \"certified-operators-q9rwk\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") " pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.162412 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht9f5\" (UniqueName: \"kubernetes.io/projected/bae39b2b-a087-457b-bd48-1ae752e081a4-kube-api-access-ht9f5\") pod \"certified-operators-q9rwk\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") " pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.163070 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-utilities\") pod \"certified-operators-q9rwk\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") " pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.163121 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-catalog-content\") pod \"certified-operators-q9rwk\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") 
" pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.195818 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht9f5\" (UniqueName: \"kubernetes.io/projected/bae39b2b-a087-457b-bd48-1ae752e081a4-kube-api-access-ht9f5\") pod \"certified-operators-q9rwk\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") " pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.279908 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:23 crc kubenswrapper[4748]: I0320 11:02:23.814080 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q9rwk"] Mar 20 11:02:24 crc kubenswrapper[4748]: I0320 11:02:24.737467 4748 generic.go:334] "Generic (PLEG): container finished" podID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerID="0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401" exitCode=0 Mar 20 11:02:24 crc kubenswrapper[4748]: I0320 11:02:24.737558 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9rwk" event={"ID":"bae39b2b-a087-457b-bd48-1ae752e081a4","Type":"ContainerDied","Data":"0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401"} Mar 20 11:02:24 crc kubenswrapper[4748]: I0320 11:02:24.740559 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9rwk" event={"ID":"bae39b2b-a087-457b-bd48-1ae752e081a4","Type":"ContainerStarted","Data":"5e72329c6366758ff08c50d148c0ea5a58a968205fdd3416bca7bcc7d5eefd75"} Mar 20 11:02:25 crc kubenswrapper[4748]: I0320 11:02:25.752130 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9rwk" 
event={"ID":"bae39b2b-a087-457b-bd48-1ae752e081a4","Type":"ContainerStarted","Data":"0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a"} Mar 20 11:02:26 crc kubenswrapper[4748]: I0320 11:02:26.764200 4748 generic.go:334] "Generic (PLEG): container finished" podID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerID="0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a" exitCode=0 Mar 20 11:02:26 crc kubenswrapper[4748]: I0320 11:02:26.764264 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9rwk" event={"ID":"bae39b2b-a087-457b-bd48-1ae752e081a4","Type":"ContainerDied","Data":"0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a"} Mar 20 11:02:27 crc kubenswrapper[4748]: I0320 11:02:27.779028 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9rwk" event={"ID":"bae39b2b-a087-457b-bd48-1ae752e081a4","Type":"ContainerStarted","Data":"82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef"} Mar 20 11:02:27 crc kubenswrapper[4748]: I0320 11:02:27.798457 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q9rwk" podStartSLOduration=3.262242142 podStartE2EDuration="5.798423514s" podCreationTimestamp="2026-03-20 11:02:22 +0000 UTC" firstStartedPulling="2026-03-20 11:02:24.740229478 +0000 UTC m=+1579.881775282" lastFinishedPulling="2026-03-20 11:02:27.27641084 +0000 UTC m=+1582.417956654" observedRunningTime="2026-03-20 11:02:27.798023394 +0000 UTC m=+1582.939569218" watchObservedRunningTime="2026-03-20 11:02:27.798423514 +0000 UTC m=+1582.939969338" Mar 20 11:02:33 crc kubenswrapper[4748]: I0320 11:02:33.280033 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:33 crc kubenswrapper[4748]: I0320 11:02:33.281117 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:33 crc kubenswrapper[4748]: I0320 11:02:33.382387 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:33 crc kubenswrapper[4748]: I0320 11:02:33.908546 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:33 crc kubenswrapper[4748]: I0320 11:02:33.979983 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q9rwk"] Mar 20 11:02:35 crc kubenswrapper[4748]: I0320 11:02:35.872105 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q9rwk" podUID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerName="registry-server" containerID="cri-o://82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef" gracePeriod=2 Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.392281 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.589259 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht9f5\" (UniqueName: \"kubernetes.io/projected/bae39b2b-a087-457b-bd48-1ae752e081a4-kube-api-access-ht9f5\") pod \"bae39b2b-a087-457b-bd48-1ae752e081a4\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") " Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.589514 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-catalog-content\") pod \"bae39b2b-a087-457b-bd48-1ae752e081a4\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") " Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.589643 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-utilities\") pod \"bae39b2b-a087-457b-bd48-1ae752e081a4\" (UID: \"bae39b2b-a087-457b-bd48-1ae752e081a4\") " Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.590985 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-utilities" (OuterVolumeSpecName: "utilities") pod "bae39b2b-a087-457b-bd48-1ae752e081a4" (UID: "bae39b2b-a087-457b-bd48-1ae752e081a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.599565 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae39b2b-a087-457b-bd48-1ae752e081a4-kube-api-access-ht9f5" (OuterVolumeSpecName: "kube-api-access-ht9f5") pod "bae39b2b-a087-457b-bd48-1ae752e081a4" (UID: "bae39b2b-a087-457b-bd48-1ae752e081a4"). InnerVolumeSpecName "kube-api-access-ht9f5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.692389 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.692433 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht9f5\" (UniqueName: \"kubernetes.io/projected/bae39b2b-a087-457b-bd48-1ae752e081a4-kube-api-access-ht9f5\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.886589 4748 generic.go:334] "Generic (PLEG): container finished" podID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerID="82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef" exitCode=0 Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.886643 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9rwk" event={"ID":"bae39b2b-a087-457b-bd48-1ae752e081a4","Type":"ContainerDied","Data":"82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef"} Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.886682 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9rwk" event={"ID":"bae39b2b-a087-457b-bd48-1ae752e081a4","Type":"ContainerDied","Data":"5e72329c6366758ff08c50d148c0ea5a58a968205fdd3416bca7bcc7d5eefd75"} Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.886706 4748 scope.go:117] "RemoveContainer" containerID="82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef" Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.886697 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q9rwk" Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.932676 4748 scope.go:117] "RemoveContainer" containerID="0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a" Mar 20 11:02:36 crc kubenswrapper[4748]: I0320 11:02:36.966329 4748 scope.go:117] "RemoveContainer" containerID="0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401" Mar 20 11:02:37 crc kubenswrapper[4748]: I0320 11:02:37.010882 4748 scope.go:117] "RemoveContainer" containerID="82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef" Mar 20 11:02:37 crc kubenswrapper[4748]: E0320 11:02:37.012570 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef\": container with ID starting with 82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef not found: ID does not exist" containerID="82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef" Mar 20 11:02:37 crc kubenswrapper[4748]: I0320 11:02:37.012651 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef"} err="failed to get container status \"82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef\": rpc error: code = NotFound desc = could not find container \"82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef\": container with ID starting with 82096ee019ab65ea25c401772ff38cc840c27703ed0cedd43df2111243fdb6ef not found: ID does not exist" Mar 20 11:02:37 crc kubenswrapper[4748]: I0320 11:02:37.012702 4748 scope.go:117] "RemoveContainer" containerID="0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a" Mar 20 11:02:37 crc kubenswrapper[4748]: E0320 11:02:37.013435 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a\": container with ID starting with 0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a not found: ID does not exist" containerID="0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a" Mar 20 11:02:37 crc kubenswrapper[4748]: I0320 11:02:37.013489 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a"} err="failed to get container status \"0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a\": rpc error: code = NotFound desc = could not find container \"0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a\": container with ID starting with 0c6e0ded94fb43056d9b94a7f4b3b5c93f57691b9d7d6312ca4a305078e1c42a not found: ID does not exist" Mar 20 11:02:37 crc kubenswrapper[4748]: I0320 11:02:37.013527 4748 scope.go:117] "RemoveContainer" containerID="0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401" Mar 20 11:02:37 crc kubenswrapper[4748]: E0320 11:02:37.026195 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401\": container with ID starting with 0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401 not found: ID does not exist" containerID="0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401" Mar 20 11:02:37 crc kubenswrapper[4748]: I0320 11:02:37.026857 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401"} err="failed to get container status \"0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401\": rpc error: code = NotFound desc = could not find container 
\"0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401\": container with ID starting with 0b303a97d10ea2da638f0c77dd5d9fbb1fbdb7a655d1be114e52b8db68ce2401 not found: ID does not exist" Mar 20 11:02:37 crc kubenswrapper[4748]: I0320 11:02:37.494348 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bae39b2b-a087-457b-bd48-1ae752e081a4" (UID: "bae39b2b-a087-457b-bd48-1ae752e081a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:37 crc kubenswrapper[4748]: I0320 11:02:37.509047 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae39b2b-a087-457b-bd48-1ae752e081a4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:37 crc kubenswrapper[4748]: I0320 11:02:37.814689 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q9rwk"] Mar 20 11:02:37 crc kubenswrapper[4748]: I0320 11:02:37.825260 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q9rwk"] Mar 20 11:02:39 crc kubenswrapper[4748]: I0320 11:02:39.527140 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae39b2b-a087-457b-bd48-1ae752e081a4" path="/var/lib/kubelet/pods/bae39b2b-a087-457b-bd48-1ae752e081a4/volumes" Mar 20 11:03:17 crc kubenswrapper[4748]: I0320 11:03:17.668543 4748 scope.go:117] "RemoveContainer" containerID="66fa7651be64027ace9f7143628b017a1dd50c839db23f62930476af8cd84dd3" Mar 20 11:03:40 crc kubenswrapper[4748]: I0320 11:03:40.260799 4748 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-p9wd8" podUID="246b06bc-5f0b-4ef1-87eb-a0f56ad26e30" containerName="registry-server" probeResult="failure" output=< Mar 20 11:03:40 crc 
kubenswrapper[4748]: timeout: failed to connect service ":50051" within 1s Mar 20 11:03:40 crc kubenswrapper[4748]: > Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.153488 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566744-2c55t"] Mar 20 11:04:00 crc kubenswrapper[4748]: E0320 11:04:00.154617 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerName="extract-content" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.154633 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerName="extract-content" Mar 20 11:04:00 crc kubenswrapper[4748]: E0320 11:04:00.154654 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerName="extract-utilities" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.154662 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerName="extract-utilities" Mar 20 11:04:00 crc kubenswrapper[4748]: E0320 11:04:00.154700 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerName="registry-server" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.154709 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerName="registry-server" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.154980 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae39b2b-a087-457b-bd48-1ae752e081a4" containerName="registry-server" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.155770 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-2c55t" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.159544 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.159780 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.159861 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.168391 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-2c55t"] Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.258462 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5p26\" (UniqueName: \"kubernetes.io/projected/3b5d23e4-abaf-49c2-87df-fb1243d774f6-kube-api-access-b5p26\") pod \"auto-csr-approver-29566744-2c55t\" (UID: \"3b5d23e4-abaf-49c2-87df-fb1243d774f6\") " pod="openshift-infra/auto-csr-approver-29566744-2c55t" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.361064 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5p26\" (UniqueName: \"kubernetes.io/projected/3b5d23e4-abaf-49c2-87df-fb1243d774f6-kube-api-access-b5p26\") pod \"auto-csr-approver-29566744-2c55t\" (UID: \"3b5d23e4-abaf-49c2-87df-fb1243d774f6\") " pod="openshift-infra/auto-csr-approver-29566744-2c55t" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.380525 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5p26\" (UniqueName: \"kubernetes.io/projected/3b5d23e4-abaf-49c2-87df-fb1243d774f6-kube-api-access-b5p26\") pod \"auto-csr-approver-29566744-2c55t\" (UID: \"3b5d23e4-abaf-49c2-87df-fb1243d774f6\") " 
pod="openshift-infra/auto-csr-approver-29566744-2c55t" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.481514 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-2c55t" Mar 20 11:04:00 crc kubenswrapper[4748]: I0320 11:04:00.905300 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-2c55t"] Mar 20 11:04:01 crc kubenswrapper[4748]: I0320 11:04:01.685958 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-2c55t" event={"ID":"3b5d23e4-abaf-49c2-87df-fb1243d774f6","Type":"ContainerStarted","Data":"ce4c1aa64c22c19c93918ad69a547af54091626da06da6bd1460220c911af523"} Mar 20 11:04:02 crc kubenswrapper[4748]: I0320 11:04:02.695501 4748 generic.go:334] "Generic (PLEG): container finished" podID="3b5d23e4-abaf-49c2-87df-fb1243d774f6" containerID="4b8527fdfa261ede39a84da80f5b2d12a0691b869c53b7e9ffce1b4b103e38e7" exitCode=0 Mar 20 11:04:02 crc kubenswrapper[4748]: I0320 11:04:02.695771 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-2c55t" event={"ID":"3b5d23e4-abaf-49c2-87df-fb1243d774f6","Type":"ContainerDied","Data":"4b8527fdfa261ede39a84da80f5b2d12a0691b869c53b7e9ffce1b4b103e38e7"} Mar 20 11:04:04 crc kubenswrapper[4748]: I0320 11:04:04.044256 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-2c55t" Mar 20 11:04:04 crc kubenswrapper[4748]: I0320 11:04:04.247063 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5p26\" (UniqueName: \"kubernetes.io/projected/3b5d23e4-abaf-49c2-87df-fb1243d774f6-kube-api-access-b5p26\") pod \"3b5d23e4-abaf-49c2-87df-fb1243d774f6\" (UID: \"3b5d23e4-abaf-49c2-87df-fb1243d774f6\") " Mar 20 11:04:04 crc kubenswrapper[4748]: I0320 11:04:04.253402 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5d23e4-abaf-49c2-87df-fb1243d774f6-kube-api-access-b5p26" (OuterVolumeSpecName: "kube-api-access-b5p26") pod "3b5d23e4-abaf-49c2-87df-fb1243d774f6" (UID: "3b5d23e4-abaf-49c2-87df-fb1243d774f6"). InnerVolumeSpecName "kube-api-access-b5p26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:04:04 crc kubenswrapper[4748]: I0320 11:04:04.349058 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5p26\" (UniqueName: \"kubernetes.io/projected/3b5d23e4-abaf-49c2-87df-fb1243d774f6-kube-api-access-b5p26\") on node \"crc\" DevicePath \"\"" Mar 20 11:04:04 crc kubenswrapper[4748]: I0320 11:04:04.720666 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-2c55t" event={"ID":"3b5d23e4-abaf-49c2-87df-fb1243d774f6","Type":"ContainerDied","Data":"ce4c1aa64c22c19c93918ad69a547af54091626da06da6bd1460220c911af523"} Mar 20 11:04:04 crc kubenswrapper[4748]: I0320 11:04:04.720708 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce4c1aa64c22c19c93918ad69a547af54091626da06da6bd1460220c911af523" Mar 20 11:04:04 crc kubenswrapper[4748]: I0320 11:04:04.720746 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-2c55t" Mar 20 11:04:05 crc kubenswrapper[4748]: I0320 11:04:05.111320 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-k6md5"] Mar 20 11:04:05 crc kubenswrapper[4748]: I0320 11:04:05.118774 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-k6md5"] Mar 20 11:04:05 crc kubenswrapper[4748]: I0320 11:04:05.530402 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ebd6f9-305e-482a-9f7a-e5a63e04921a" path="/var/lib/kubelet/pods/50ebd6f9-305e-482a-9f7a-e5a63e04921a/volumes" Mar 20 11:04:17 crc kubenswrapper[4748]: I0320 11:04:17.723107 4748 scope.go:117] "RemoveContainer" containerID="b548843003db3272aba4481961ab94ee37fe5d87780e1d7d7b1100cbb1db6c7e" Mar 20 11:04:17 crc kubenswrapper[4748]: I0320 11:04:17.756389 4748 scope.go:117] "RemoveContainer" containerID="d4cbd022be390d67fc0a3423cfe7cffd3eabc70daf6a0fd850ed78e3f32e83e5" Mar 20 11:04:17 crc kubenswrapper[4748]: I0320 11:04:17.779145 4748 scope.go:117] "RemoveContainer" containerID="19fc679c210222d1ff016794891f4e94e4de33a4324d0a3466fe5ac80621233f" Mar 20 11:04:17 crc kubenswrapper[4748]: I0320 11:04:17.796019 4748 scope.go:117] "RemoveContainer" containerID="008873a1a263b9df636360272d0f050ef0e564b8e9ef2b35e92e27442ee77c86" Mar 20 11:04:17 crc kubenswrapper[4748]: I0320 11:04:17.819437 4748 scope.go:117] "RemoveContainer" containerID="7df4a7959b03db15ccbc3d1cd86e1cd6e8de9867705181272a1e3ae10b2473dc" Mar 20 11:04:17 crc kubenswrapper[4748]: I0320 11:04:17.837740 4748 scope.go:117] "RemoveContainer" containerID="5378fa20cdc8defe0e8c30928c7e93473341304ae8d67e7d815e059dbbb01837" Mar 20 11:04:17 crc kubenswrapper[4748]: I0320 11:04:17.857130 4748 scope.go:117] "RemoveContainer" containerID="7cd5e8205006dfbdd4f33f086718c1d4b03326a930a7c42ea5cec3dbf6bf18d1" Mar 20 11:04:17 crc kubenswrapper[4748]: I0320 11:04:17.875523 4748 
scope.go:117] "RemoveContainer" containerID="495cd93e6040a80b4c2b6f07f1a8b3dedbcf2a7062c8e29da71c1bc92d3d8751" Mar 20 11:04:17 crc kubenswrapper[4748]: I0320 11:04:17.893066 4748 scope.go:117] "RemoveContainer" containerID="35dde74b88e408e8e7cfff28ce2fdf992cf0dff4e05b7d0fd2f64f7b6d0be800" Mar 20 11:04:17 crc kubenswrapper[4748]: I0320 11:04:17.913478 4748 scope.go:117] "RemoveContainer" containerID="49a3fb574b78615d975ca586c2e94ce881cd83d29c263d85d451be63e9565da0" Mar 20 11:04:42 crc kubenswrapper[4748]: I0320 11:04:42.927976 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:04:42 crc kubenswrapper[4748]: I0320 11:04:42.928489 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:05:12 crc kubenswrapper[4748]: I0320 11:05:12.928341 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:05:12 crc kubenswrapper[4748]: I0320 11:05:12.929380 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:05:17 crc 
kubenswrapper[4748]: I0320 11:05:17.049420 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-blm4h"] Mar 20 11:05:17 crc kubenswrapper[4748]: I0320 11:05:17.062412 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-96de-account-create-update-6v8g4"] Mar 20 11:05:17 crc kubenswrapper[4748]: I0320 11:05:17.074436 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-96de-account-create-update-6v8g4"] Mar 20 11:05:17 crc kubenswrapper[4748]: I0320 11:05:17.086582 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-blm4h"] Mar 20 11:05:17 crc kubenswrapper[4748]: I0320 11:05:17.528391 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="089a3bc7-a7c8-4579-bb88-8e7e515e750a" path="/var/lib/kubelet/pods/089a3bc7-a7c8-4579-bb88-8e7e515e750a/volumes" Mar 20 11:05:17 crc kubenswrapper[4748]: I0320 11:05:17.529274 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff27e027-9d58-432b-81aa-6ceacbf7fe94" path="/var/lib/kubelet/pods/ff27e027-9d58-432b-81aa-6ceacbf7fe94/volumes" Mar 20 11:05:18 crc kubenswrapper[4748]: I0320 11:05:18.051506 4748 scope.go:117] "RemoveContainer" containerID="71e234e3809b3d8da9dde5d5f2196c5e4903bdc342f50c61309a220a0ca75dbe" Mar 20 11:05:18 crc kubenswrapper[4748]: I0320 11:05:18.094991 4748 scope.go:117] "RemoveContainer" containerID="b09444d0b28597ebdc3857a37e7f455eaea7e7cca49abbdac92f5f1276f00468" Mar 20 11:05:18 crc kubenswrapper[4748]: I0320 11:05:18.128008 4748 scope.go:117] "RemoveContainer" containerID="ddcac3d621e07e876e9eb9fe3a0fa0559595aa5e04dc6c8838cf569b7effb9cf" Mar 20 11:05:18 crc kubenswrapper[4748]: I0320 11:05:18.146726 4748 scope.go:117] "RemoveContainer" containerID="f71e88db4a9250b6893dc8de5e8a6e00028b0de0a607e36410458115cbcbc439" Mar 20 11:05:18 crc kubenswrapper[4748]: I0320 11:05:18.180231 4748 scope.go:117] "RemoveContainer" 
containerID="a594226dd2e1fe41ac6300f680114ffedfadebb30c0ede6352508cec5a95a15e" Mar 20 11:05:18 crc kubenswrapper[4748]: I0320 11:05:18.202164 4748 scope.go:117] "RemoveContainer" containerID="f92e189958f888ce283b036bc284b4431af82bd0c2f0bd94e1a43b803ebb2fc2" Mar 20 11:05:26 crc kubenswrapper[4748]: I0320 11:05:26.512384 4748 generic.go:334] "Generic (PLEG): container finished" podID="01e10255-e1d0-4e62-9b54-4c1043b5f502" containerID="5164b6bc15b2d4bbd0bb40e76f70af7ffc406cf66de263f26f18c83e5d50d2d9" exitCode=0 Mar 20 11:05:26 crc kubenswrapper[4748]: I0320 11:05:26.512530 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" event={"ID":"01e10255-e1d0-4e62-9b54-4c1043b5f502","Type":"ContainerDied","Data":"5164b6bc15b2d4bbd0bb40e76f70af7ffc406cf66de263f26f18c83e5d50d2d9"} Mar 20 11:05:27 crc kubenswrapper[4748]: I0320 11:05:27.918439 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.024142 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-bootstrap-combined-ca-bundle\") pod \"01e10255-e1d0-4e62-9b54-4c1043b5f502\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.024275 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-ssh-key-openstack-edpm-ipam\") pod \"01e10255-e1d0-4e62-9b54-4c1043b5f502\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.024371 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5qjm\" 
(UniqueName: \"kubernetes.io/projected/01e10255-e1d0-4e62-9b54-4c1043b5f502-kube-api-access-v5qjm\") pod \"01e10255-e1d0-4e62-9b54-4c1043b5f502\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.024412 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-inventory\") pod \"01e10255-e1d0-4e62-9b54-4c1043b5f502\" (UID: \"01e10255-e1d0-4e62-9b54-4c1043b5f502\") " Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.030319 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e10255-e1d0-4e62-9b54-4c1043b5f502-kube-api-access-v5qjm" (OuterVolumeSpecName: "kube-api-access-v5qjm") pod "01e10255-e1d0-4e62-9b54-4c1043b5f502" (UID: "01e10255-e1d0-4e62-9b54-4c1043b5f502"). InnerVolumeSpecName "kube-api-access-v5qjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.030976 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "01e10255-e1d0-4e62-9b54-4c1043b5f502" (UID: "01e10255-e1d0-4e62-9b54-4c1043b5f502"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.054120 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-inventory" (OuterVolumeSpecName: "inventory") pod "01e10255-e1d0-4e62-9b54-4c1043b5f502" (UID: "01e10255-e1d0-4e62-9b54-4c1043b5f502"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.063949 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "01e10255-e1d0-4e62-9b54-4c1043b5f502" (UID: "01e10255-e1d0-4e62-9b54-4c1043b5f502"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.126583 4748 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.126815 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.126981 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5qjm\" (UniqueName: \"kubernetes.io/projected/01e10255-e1d0-4e62-9b54-4c1043b5f502-kube-api-access-v5qjm\") on node \"crc\" DevicePath \"\"" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.127108 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01e10255-e1d0-4e62-9b54-4c1043b5f502-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.560552 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" event={"ID":"01e10255-e1d0-4e62-9b54-4c1043b5f502","Type":"ContainerDied","Data":"7a91cb434075fb975306e67b3ddb13b4233768cea44f7632aba5b8e82531d530"} Mar 20 11:05:28 
crc kubenswrapper[4748]: I0320 11:05:28.560608 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a91cb434075fb975306e67b3ddb13b4233768cea44f7632aba5b8e82531d530" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.560620 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.640943 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm"] Mar 20 11:05:28 crc kubenswrapper[4748]: E0320 11:05:28.641422 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e10255-e1d0-4e62-9b54-4c1043b5f502" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.641440 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e10255-e1d0-4e62-9b54-4c1043b5f502" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 11:05:28 crc kubenswrapper[4748]: E0320 11:05:28.641481 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5d23e4-abaf-49c2-87df-fb1243d774f6" containerName="oc" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.641489 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5d23e4-abaf-49c2-87df-fb1243d774f6" containerName="oc" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.641653 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5d23e4-abaf-49c2-87df-fb1243d774f6" containerName="oc" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.641676 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e10255-e1d0-4e62-9b54-4c1043b5f502" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.642374 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.647921 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.648194 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.648218 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.648388 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.652399 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm"] Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.738426 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rzshm\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.738720 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft278\" (UniqueName: \"kubernetes.io/projected/a0708128-eacf-422a-8dac-98032a9f12e7-kube-api-access-ft278\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rzshm\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:28 crc 
kubenswrapper[4748]: I0320 11:05:28.738909 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rzshm\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.840559 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rzshm\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.840717 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rzshm\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.840762 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft278\" (UniqueName: \"kubernetes.io/projected/a0708128-eacf-422a-8dac-98032a9f12e7-kube-api-access-ft278\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rzshm\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.845579 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rzshm\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.851648 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rzshm\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.869390 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft278\" (UniqueName: \"kubernetes.io/projected/a0708128-eacf-422a-8dac-98032a9f12e7-kube-api-access-ft278\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rzshm\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:28 crc kubenswrapper[4748]: I0320 11:05:28.967193 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:05:29 crc kubenswrapper[4748]: I0320 11:05:29.461247 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm"] Mar 20 11:05:29 crc kubenswrapper[4748]: I0320 11:05:29.466486 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:05:29 crc kubenswrapper[4748]: I0320 11:05:29.571404 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" event={"ID":"a0708128-eacf-422a-8dac-98032a9f12e7","Type":"ContainerStarted","Data":"3fd3817a58e3c94536b977651e0adcc06522ca0fd3fe1a7f33e7c65d1ec644ca"} Mar 20 11:05:30 crc kubenswrapper[4748]: I0320 11:05:30.032656 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mzn7b"] Mar 20 11:05:30 crc kubenswrapper[4748]: I0320 11:05:30.044928 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-n8tkm"] Mar 20 11:05:30 crc kubenswrapper[4748]: I0320 11:05:30.057640 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mzn7b"] Mar 20 11:05:30 crc kubenswrapper[4748]: I0320 11:05:30.069170 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-n8tkm"] Mar 20 11:05:30 crc kubenswrapper[4748]: I0320 11:05:30.586240 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" event={"ID":"a0708128-eacf-422a-8dac-98032a9f12e7","Type":"ContainerStarted","Data":"06504aa24b314c15c08b5139f367c9391bfa38b4788810518bd1c22fc37ea54f"} Mar 20 11:05:30 crc kubenswrapper[4748]: I0320 11:05:30.649024 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" 
podStartSLOduration=2.497536141 podStartE2EDuration="2.649001873s" podCreationTimestamp="2026-03-20 11:05:28 +0000 UTC" firstStartedPulling="2026-03-20 11:05:29.466214596 +0000 UTC m=+1764.607760410" lastFinishedPulling="2026-03-20 11:05:29.617680328 +0000 UTC m=+1764.759226142" observedRunningTime="2026-03-20 11:05:30.642235024 +0000 UTC m=+1765.783780848" watchObservedRunningTime="2026-03-20 11:05:30.649001873 +0000 UTC m=+1765.790547677" Mar 20 11:05:31 crc kubenswrapper[4748]: I0320 11:05:31.037878 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-915e-account-create-update-rq9s8"] Mar 20 11:05:31 crc kubenswrapper[4748]: I0320 11:05:31.048997 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7a59-account-create-update-vt42l"] Mar 20 11:05:31 crc kubenswrapper[4748]: I0320 11:05:31.060382 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-915e-account-create-update-rq9s8"] Mar 20 11:05:31 crc kubenswrapper[4748]: I0320 11:05:31.068901 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7a59-account-create-update-vt42l"] Mar 20 11:05:31 crc kubenswrapper[4748]: I0320 11:05:31.527417 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="056ff617-ed15-4344-8883-afb1abd38abb" path="/var/lib/kubelet/pods/056ff617-ed15-4344-8883-afb1abd38abb/volumes" Mar 20 11:05:31 crc kubenswrapper[4748]: I0320 11:05:31.528612 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5905f14e-ca31-4983-9335-db320b71f0d1" path="/var/lib/kubelet/pods/5905f14e-ca31-4983-9335-db320b71f0d1/volumes" Mar 20 11:05:31 crc kubenswrapper[4748]: I0320 11:05:31.529183 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b34c916-eb28-4060-8a0c-22b7ae45bcaa" path="/var/lib/kubelet/pods/7b34c916-eb28-4060-8a0c-22b7ae45bcaa/volumes" Mar 20 11:05:31 crc kubenswrapper[4748]: I0320 11:05:31.529749 4748 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="8949f08a-2d27-4055-86a7-a66a77de530b" path="/var/lib/kubelet/pods/8949f08a-2d27-4055-86a7-a66a77de530b/volumes" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.084255 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l55x2"] Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.086224 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.098815 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l55x2"] Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.234063 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvzll\" (UniqueName: \"kubernetes.io/projected/c3235624-b08a-44d1-a2c1-21c9603ff5a1-kube-api-access-cvzll\") pod \"community-operators-l55x2\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.234142 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-catalog-content\") pod \"community-operators-l55x2\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.234193 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-utilities\") pod \"community-operators-l55x2\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.335947 
4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvzll\" (UniqueName: \"kubernetes.io/projected/c3235624-b08a-44d1-a2c1-21c9603ff5a1-kube-api-access-cvzll\") pod \"community-operators-l55x2\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.336047 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-catalog-content\") pod \"community-operators-l55x2\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.336098 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-utilities\") pod \"community-operators-l55x2\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.336769 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-catalog-content\") pod \"community-operators-l55x2\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.336878 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-utilities\") pod \"community-operators-l55x2\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.365062 4748 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cvzll\" (UniqueName: \"kubernetes.io/projected/c3235624-b08a-44d1-a2c1-21c9603ff5a1-kube-api-access-cvzll\") pod \"community-operators-l55x2\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.431774 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:32 crc kubenswrapper[4748]: I0320 11:05:32.990299 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l55x2"] Mar 20 11:05:32 crc kubenswrapper[4748]: W0320 11:05:32.993990 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3235624_b08a_44d1_a2c1_21c9603ff5a1.slice/crio-5d51f5475ca9faa4d6151c4d252ea4ddca9b693e736b282b3774812a6997c302 WatchSource:0}: Error finding container 5d51f5475ca9faa4d6151c4d252ea4ddca9b693e736b282b3774812a6997c302: Status 404 returned error can't find the container with id 5d51f5475ca9faa4d6151c4d252ea4ddca9b693e736b282b3774812a6997c302 Mar 20 11:05:33 crc kubenswrapper[4748]: I0320 11:05:33.640330 4748 generic.go:334] "Generic (PLEG): container finished" podID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerID="60d5be33ee5cab8f2ed39d2d959a7688663ab8019c4f8c85ad9b31ea11705511" exitCode=0 Mar 20 11:05:33 crc kubenswrapper[4748]: I0320 11:05:33.640379 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l55x2" event={"ID":"c3235624-b08a-44d1-a2c1-21c9603ff5a1","Type":"ContainerDied","Data":"60d5be33ee5cab8f2ed39d2d959a7688663ab8019c4f8c85ad9b31ea11705511"} Mar 20 11:05:33 crc kubenswrapper[4748]: I0320 11:05:33.640428 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l55x2" 
event={"ID":"c3235624-b08a-44d1-a2c1-21c9603ff5a1","Type":"ContainerStarted","Data":"5d51f5475ca9faa4d6151c4d252ea4ddca9b693e736b282b3774812a6997c302"} Mar 20 11:05:34 crc kubenswrapper[4748]: I0320 11:05:34.651174 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l55x2" event={"ID":"c3235624-b08a-44d1-a2c1-21c9603ff5a1","Type":"ContainerStarted","Data":"5bbc1293cbb3de529f93acaf695198011c480ddb7d1baf3faffda7d88af1ea24"} Mar 20 11:05:36 crc kubenswrapper[4748]: I0320 11:05:36.671636 4748 generic.go:334] "Generic (PLEG): container finished" podID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerID="5bbc1293cbb3de529f93acaf695198011c480ddb7d1baf3faffda7d88af1ea24" exitCode=0 Mar 20 11:05:36 crc kubenswrapper[4748]: I0320 11:05:36.671739 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l55x2" event={"ID":"c3235624-b08a-44d1-a2c1-21c9603ff5a1","Type":"ContainerDied","Data":"5bbc1293cbb3de529f93acaf695198011c480ddb7d1baf3faffda7d88af1ea24"} Mar 20 11:05:37 crc kubenswrapper[4748]: I0320 11:05:37.685499 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l55x2" event={"ID":"c3235624-b08a-44d1-a2c1-21c9603ff5a1","Type":"ContainerStarted","Data":"ff2f9a8f44b5eb01c3cf635db3b88fd88d1823ec01209ddaf4ad4a358354dc11"} Mar 20 11:05:37 crc kubenswrapper[4748]: I0320 11:05:37.711664 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l55x2" podStartSLOduration=2.2611253700000002 podStartE2EDuration="5.711644711s" podCreationTimestamp="2026-03-20 11:05:32 +0000 UTC" firstStartedPulling="2026-03-20 11:05:33.642081349 +0000 UTC m=+1768.783627173" lastFinishedPulling="2026-03-20 11:05:37.0926007 +0000 UTC m=+1772.234146514" observedRunningTime="2026-03-20 11:05:37.708927163 +0000 UTC m=+1772.850472977" watchObservedRunningTime="2026-03-20 11:05:37.711644711 +0000 UTC 
m=+1772.853190525" Mar 20 11:05:39 crc kubenswrapper[4748]: I0320 11:05:39.027811 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gk4gw"] Mar 20 11:05:39 crc kubenswrapper[4748]: I0320 11:05:39.037396 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gk4gw"] Mar 20 11:05:39 crc kubenswrapper[4748]: I0320 11:05:39.527181 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07aca114-65eb-4bd6-8d63-d17635771c2d" path="/var/lib/kubelet/pods/07aca114-65eb-4bd6-8d63-d17635771c2d/volumes" Mar 20 11:05:42 crc kubenswrapper[4748]: I0320 11:05:42.432876 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:42 crc kubenswrapper[4748]: I0320 11:05:42.433506 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:42 crc kubenswrapper[4748]: I0320 11:05:42.485125 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:42 crc kubenswrapper[4748]: I0320 11:05:42.784403 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:42 crc kubenswrapper[4748]: I0320 11:05:42.846796 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l55x2"] Mar 20 11:05:42 crc kubenswrapper[4748]: I0320 11:05:42.929146 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:05:42 crc kubenswrapper[4748]: I0320 11:05:42.929227 4748 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:05:42 crc kubenswrapper[4748]: I0320 11:05:42.929290 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 11:05:42 crc kubenswrapper[4748]: I0320 11:05:42.930232 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:05:42 crc kubenswrapper[4748]: I0320 11:05:42.930297 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" gracePeriod=600 Mar 20 11:05:43 crc kubenswrapper[4748]: E0320 11:05:43.175231 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:05:43 crc kubenswrapper[4748]: I0320 11:05:43.746367 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" 
containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" exitCode=0 Mar 20 11:05:43 crc kubenswrapper[4748]: I0320 11:05:43.746455 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead"} Mar 20 11:05:43 crc kubenswrapper[4748]: I0320 11:05:43.746526 4748 scope.go:117] "RemoveContainer" containerID="ec8318a59f2a0dcbdd19ff1535aafa2664120c1ab98ee7f0ce82eda8b7b3e371" Mar 20 11:05:43 crc kubenswrapper[4748]: I0320 11:05:43.747359 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:05:43 crc kubenswrapper[4748]: E0320 11:05:43.747672 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:05:44 crc kubenswrapper[4748]: I0320 11:05:44.762775 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l55x2" podUID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerName="registry-server" containerID="cri-o://ff2f9a8f44b5eb01c3cf635db3b88fd88d1823ec01209ddaf4ad4a358354dc11" gracePeriod=2 Mar 20 11:05:45 crc kubenswrapper[4748]: I0320 11:05:45.776032 4748 generic.go:334] "Generic (PLEG): container finished" podID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerID="ff2f9a8f44b5eb01c3cf635db3b88fd88d1823ec01209ddaf4ad4a358354dc11" exitCode=0 Mar 20 11:05:45 crc kubenswrapper[4748]: I0320 11:05:45.776080 4748 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-l55x2" event={"ID":"c3235624-b08a-44d1-a2c1-21c9603ff5a1","Type":"ContainerDied","Data":"ff2f9a8f44b5eb01c3cf635db3b88fd88d1823ec01209ddaf4ad4a358354dc11"} Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.274238 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.346154 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-catalog-content\") pod \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.346653 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-utilities\") pod \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.346714 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvzll\" (UniqueName: \"kubernetes.io/projected/c3235624-b08a-44d1-a2c1-21c9603ff5a1-kube-api-access-cvzll\") pod \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\" (UID: \"c3235624-b08a-44d1-a2c1-21c9603ff5a1\") " Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.348887 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-utilities" (OuterVolumeSpecName: "utilities") pod "c3235624-b08a-44d1-a2c1-21c9603ff5a1" (UID: "c3235624-b08a-44d1-a2c1-21c9603ff5a1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.360201 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3235624-b08a-44d1-a2c1-21c9603ff5a1-kube-api-access-cvzll" (OuterVolumeSpecName: "kube-api-access-cvzll") pod "c3235624-b08a-44d1-a2c1-21c9603ff5a1" (UID: "c3235624-b08a-44d1-a2c1-21c9603ff5a1"). InnerVolumeSpecName "kube-api-access-cvzll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.407768 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3235624-b08a-44d1-a2c1-21c9603ff5a1" (UID: "c3235624-b08a-44d1-a2c1-21c9603ff5a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.449074 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.449105 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3235624-b08a-44d1-a2c1-21c9603ff5a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.449117 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvzll\" (UniqueName: \"kubernetes.io/projected/c3235624-b08a-44d1-a2c1-21c9603ff5a1-kube-api-access-cvzll\") on node \"crc\" DevicePath \"\"" Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.800595 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l55x2" 
event={"ID":"c3235624-b08a-44d1-a2c1-21c9603ff5a1","Type":"ContainerDied","Data":"5d51f5475ca9faa4d6151c4d252ea4ddca9b693e736b282b3774812a6997c302"} Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.800664 4748 scope.go:117] "RemoveContainer" containerID="ff2f9a8f44b5eb01c3cf635db3b88fd88d1823ec01209ddaf4ad4a358354dc11" Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.800865 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l55x2" Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.829749 4748 scope.go:117] "RemoveContainer" containerID="5bbc1293cbb3de529f93acaf695198011c480ddb7d1baf3faffda7d88af1ea24" Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.835775 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l55x2"] Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.847653 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l55x2"] Mar 20 11:05:47 crc kubenswrapper[4748]: I0320 11:05:47.863043 4748 scope.go:117] "RemoveContainer" containerID="60d5be33ee5cab8f2ed39d2d959a7688663ab8019c4f8c85ad9b31ea11705511" Mar 20 11:05:49 crc kubenswrapper[4748]: I0320 11:05:49.528054 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" path="/var/lib/kubelet/pods/c3235624-b08a-44d1-a2c1-21c9603ff5a1/volumes" Mar 20 11:05:56 crc kubenswrapper[4748]: I0320 11:05:56.515384 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:05:56 crc kubenswrapper[4748]: E0320 11:05:56.516596 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.152100 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566746-q8tck"] Mar 20 11:06:00 crc kubenswrapper[4748]: E0320 11:06:00.153090 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerName="extract-utilities" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.153105 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerName="extract-utilities" Mar 20 11:06:00 crc kubenswrapper[4748]: E0320 11:06:00.153141 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerName="extract-content" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.153147 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerName="extract-content" Mar 20 11:06:00 crc kubenswrapper[4748]: E0320 11:06:00.153159 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerName="registry-server" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.153166 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerName="registry-server" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.153355 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3235624-b08a-44d1-a2c1-21c9603ff5a1" containerName="registry-server" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.154075 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-q8tck" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.156641 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.156722 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.156880 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.163533 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-q8tck"] Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.231993 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwznz\" (UniqueName: \"kubernetes.io/projected/08153fb7-b276-48d4-a0ad-c4a433f5db7f-kube-api-access-nwznz\") pod \"auto-csr-approver-29566746-q8tck\" (UID: \"08153fb7-b276-48d4-a0ad-c4a433f5db7f\") " pod="openshift-infra/auto-csr-approver-29566746-q8tck" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.333759 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwznz\" (UniqueName: \"kubernetes.io/projected/08153fb7-b276-48d4-a0ad-c4a433f5db7f-kube-api-access-nwznz\") pod \"auto-csr-approver-29566746-q8tck\" (UID: \"08153fb7-b276-48d4-a0ad-c4a433f5db7f\") " pod="openshift-infra/auto-csr-approver-29566746-q8tck" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.352765 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwznz\" (UniqueName: \"kubernetes.io/projected/08153fb7-b276-48d4-a0ad-c4a433f5db7f-kube-api-access-nwznz\") pod \"auto-csr-approver-29566746-q8tck\" (UID: \"08153fb7-b276-48d4-a0ad-c4a433f5db7f\") " 
pod="openshift-infra/auto-csr-approver-29566746-q8tck" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.485294 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-q8tck" Mar 20 11:06:00 crc kubenswrapper[4748]: I0320 11:06:00.963292 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-q8tck"] Mar 20 11:06:01 crc kubenswrapper[4748]: I0320 11:06:01.942490 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-q8tck" event={"ID":"08153fb7-b276-48d4-a0ad-c4a433f5db7f","Type":"ContainerStarted","Data":"7c25afed62d4fa65663bcd2b47fce86c67e11984bc6304bc6a117c89cebfcfb7"} Mar 20 11:06:02 crc kubenswrapper[4748]: I0320 11:06:02.951702 4748 generic.go:334] "Generic (PLEG): container finished" podID="08153fb7-b276-48d4-a0ad-c4a433f5db7f" containerID="b77affdc552ff46fa440829ec7baf141492a8c3ef9601b449c647e1ecc6637e3" exitCode=0 Mar 20 11:06:02 crc kubenswrapper[4748]: I0320 11:06:02.951745 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-q8tck" event={"ID":"08153fb7-b276-48d4-a0ad-c4a433f5db7f","Type":"ContainerDied","Data":"b77affdc552ff46fa440829ec7baf141492a8c3ef9601b449c647e1ecc6637e3"} Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.049956 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f74c-account-create-update-s4q4j"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.070461 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-znvbx"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.080541 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f74c-account-create-update-s4q4j"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.089559 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-znvbx"] 
Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.098585 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-edd6-account-create-update-v5v7v"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.115044 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2hr4g"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.128894 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qnhjl"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.139412 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-beb0-account-create-update-ls746"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.146544 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-beb0-account-create-update-ls746"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.153086 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2hr4g"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.159608 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qnhjl"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.166007 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-edd6-account-create-update-v5v7v"] Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.527023 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b730a50-8cb3-43e9-af09-5bf38f7cfe3f" path="/var/lib/kubelet/pods/2b730a50-8cb3-43e9-af09-5bf38f7cfe3f/volumes" Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.527784 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed8081e-cf8f-41b3-829c-a65608af1644" path="/var/lib/kubelet/pods/2ed8081e-cf8f-41b3-829c-a65608af1644/volumes" Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.528485 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a2435374-d4a6-4d1d-81ab-ab6bc61ae023" path="/var/lib/kubelet/pods/a2435374-d4a6-4d1d-81ab-ab6bc61ae023/volumes" Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.529060 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be" path="/var/lib/kubelet/pods/df4bc055-0cec-45a8-b6ef-a3b1b8a3b5be/volumes" Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.530156 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb7e7326-cf77-4cb1-93a5-f463fb86b184" path="/var/lib/kubelet/pods/eb7e7326-cf77-4cb1-93a5-f463fb86b184/volumes" Mar 20 11:06:03 crc kubenswrapper[4748]: I0320 11:06:03.530709 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f" path="/var/lib/kubelet/pods/ebd3eb4a-6f8b-4ea0-b44d-e279e0c8331f/volumes" Mar 20 11:06:04 crc kubenswrapper[4748]: I0320 11:06:04.288955 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-q8tck" Mar 20 11:06:04 crc kubenswrapper[4748]: I0320 11:06:04.416759 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwznz\" (UniqueName: \"kubernetes.io/projected/08153fb7-b276-48d4-a0ad-c4a433f5db7f-kube-api-access-nwznz\") pod \"08153fb7-b276-48d4-a0ad-c4a433f5db7f\" (UID: \"08153fb7-b276-48d4-a0ad-c4a433f5db7f\") " Mar 20 11:06:04 crc kubenswrapper[4748]: I0320 11:06:04.426481 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08153fb7-b276-48d4-a0ad-c4a433f5db7f-kube-api-access-nwznz" (OuterVolumeSpecName: "kube-api-access-nwznz") pod "08153fb7-b276-48d4-a0ad-c4a433f5db7f" (UID: "08153fb7-b276-48d4-a0ad-c4a433f5db7f"). InnerVolumeSpecName "kube-api-access-nwznz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:06:04 crc kubenswrapper[4748]: I0320 11:06:04.519112 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwznz\" (UniqueName: \"kubernetes.io/projected/08153fb7-b276-48d4-a0ad-c4a433f5db7f-kube-api-access-nwznz\") on node \"crc\" DevicePath \"\"" Mar 20 11:06:04 crc kubenswrapper[4748]: I0320 11:06:04.974314 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-q8tck" event={"ID":"08153fb7-b276-48d4-a0ad-c4a433f5db7f","Type":"ContainerDied","Data":"7c25afed62d4fa65663bcd2b47fce86c67e11984bc6304bc6a117c89cebfcfb7"} Mar 20 11:06:04 crc kubenswrapper[4748]: I0320 11:06:04.974369 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c25afed62d4fa65663bcd2b47fce86c67e11984bc6304bc6a117c89cebfcfb7" Mar 20 11:06:04 crc kubenswrapper[4748]: I0320 11:06:04.974370 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-q8tck" Mar 20 11:06:05 crc kubenswrapper[4748]: I0320 11:06:05.350415 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-jzzl4"] Mar 20 11:06:05 crc kubenswrapper[4748]: I0320 11:06:05.359607 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-jzzl4"] Mar 20 11:06:05 crc kubenswrapper[4748]: I0320 11:06:05.529440 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4" path="/var/lib/kubelet/pods/4b7b4ae8-52b3-4fb3-8684-9b64c2e24ac4/volumes" Mar 20 11:06:06 crc kubenswrapper[4748]: I0320 11:06:06.035084 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-t76rv"] Mar 20 11:06:06 crc kubenswrapper[4748]: I0320 11:06:06.048859 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-t76rv"] Mar 20 
11:06:07 crc kubenswrapper[4748]: I0320 11:06:07.515698 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:06:07 crc kubenswrapper[4748]: E0320 11:06:07.516514 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:06:07 crc kubenswrapper[4748]: I0320 11:06:07.529589 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfabafe6-f3da-44d2-bb13-cc39bffc6dbd" path="/var/lib/kubelet/pods/dfabafe6-f3da-44d2-bb13-cc39bffc6dbd/volumes" Mar 20 11:06:11 crc kubenswrapper[4748]: I0320 11:06:11.048435 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kctrc"] Mar 20 11:06:11 crc kubenswrapper[4748]: I0320 11:06:11.060683 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kctrc"] Mar 20 11:06:11 crc kubenswrapper[4748]: I0320 11:06:11.527031 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a" path="/var/lib/kubelet/pods/3ab07a02-a8eb-4d81-a8e0-e8999fbbca4a/volumes" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.321918 4748 scope.go:117] "RemoveContainer" containerID="2bdafb6b2197d279e48298731c49d3da2e047b701b39d77a84a4581d41810e0a" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.350377 4748 scope.go:117] "RemoveContainer" containerID="e3f5d3bfe950565c6a7ea780b13c5fa011c1630f5c5b9ae8eaefa509cb2dd41c" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.400029 4748 scope.go:117] "RemoveContainer" 
containerID="b8f75909bd9bc56662a6978c2f85bf27abd90e98a29d9195c5fe14dc0b753532" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.461416 4748 scope.go:117] "RemoveContainer" containerID="d7bc10c1a8c9d9d3e0ee576dd9aa00a993b281bf5342e91016f27cc02bba9b86" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.486018 4748 scope.go:117] "RemoveContainer" containerID="de2cf53897f1fae8cfe402a1267d257326195e4f7f9853a322c0a26d10c18478" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.526163 4748 scope.go:117] "RemoveContainer" containerID="25793bede7a0318a469e669ff529dd03177a5100b7973134147bf88e9eb4696b" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.548172 4748 scope.go:117] "RemoveContainer" containerID="284ac5c3e73609ade4a8deae9663a678883f6ec93e2571ca8b0166050123fddf" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.588554 4748 scope.go:117] "RemoveContainer" containerID="4e908ab37e2c79196cb9b30af43500cb2e4a3912257202aab505a4d6f6d22626" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.607727 4748 scope.go:117] "RemoveContainer" containerID="c353ed06e6f8f53e1243b5b8f308e8427c57a3e3422e265e2be563a2aa08eb88" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.643895 4748 scope.go:117] "RemoveContainer" containerID="2ccdc100ca6c5f92b72e272d08534275a25bf6a2c5c36331713c59dd6c4c33da" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.666708 4748 scope.go:117] "RemoveContainer" containerID="8699a18cdf70da16c891192e1dcf8147b0e4f26040a7df9f400a9bdf1dfd12e2" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.702717 4748 scope.go:117] "RemoveContainer" containerID="118d9930410d6da787a8d2f5979051084dc728d53b3d7004706200f762bb31b1" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.723728 4748 scope.go:117] "RemoveContainer" containerID="16edf332d6a43bd6e5cbc1f5a8b01321d8d02151271b55dc3b7b4f2faef92351" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.745902 4748 scope.go:117] "RemoveContainer" 
containerID="05db1c5e609ddbd336478b4a6064b39774d3a3bc47feaa116f5ec5c21630f999" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.771151 4748 scope.go:117] "RemoveContainer" containerID="6f5ccf3fc6f379e479084fc05831e25807d7b24baa886b124561078d730d0b76" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.800882 4748 scope.go:117] "RemoveContainer" containerID="d00d1cd95d6aeea04a3f52207c886bfedc4b1bf732ae0fcfdc5f08c4c58d28bd" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.824255 4748 scope.go:117] "RemoveContainer" containerID="2139ed030748c21d1eaebb5f04701c7777310e08ef8766510c63ae73b9d9eeea" Mar 20 11:06:18 crc kubenswrapper[4748]: I0320 11:06:18.851259 4748 scope.go:117] "RemoveContainer" containerID="2f176c006cdeb4d99ef251284203bab1e079b75677e00e6eda348f7000787342" Mar 20 11:06:22 crc kubenswrapper[4748]: I0320 11:06:22.515932 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:06:22 crc kubenswrapper[4748]: E0320 11:06:22.516822 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:06:36 crc kubenswrapper[4748]: I0320 11:06:36.514964 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:06:36 crc kubenswrapper[4748]: E0320 11:06:36.515886 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:06:48 crc kubenswrapper[4748]: I0320 11:06:48.515564 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:06:48 crc kubenswrapper[4748]: E0320 11:06:48.516415 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:06:58 crc kubenswrapper[4748]: I0320 11:06:58.049432 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-744pk"] Mar 20 11:06:58 crc kubenswrapper[4748]: I0320 11:06:58.062768 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-g4qfl"] Mar 20 11:06:58 crc kubenswrapper[4748]: I0320 11:06:58.076636 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-744pk"] Mar 20 11:06:58 crc kubenswrapper[4748]: I0320 11:06:58.084408 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-g4qfl"] Mar 20 11:06:59 crc kubenswrapper[4748]: I0320 11:06:59.516139 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:06:59 crc kubenswrapper[4748]: E0320 11:06:59.516654 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:06:59 crc kubenswrapper[4748]: I0320 11:06:59.527087 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6340e84c-8ffb-40e5-a470-52a50bff86f1" path="/var/lib/kubelet/pods/6340e84c-8ffb-40e5-a470-52a50bff86f1/volumes" Mar 20 11:06:59 crc kubenswrapper[4748]: I0320 11:06:59.528321 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66aa7b6f-a021-4161-b3da-ddb593f2b169" path="/var/lib/kubelet/pods/66aa7b6f-a021-4161-b3da-ddb593f2b169/volumes" Mar 20 11:07:09 crc kubenswrapper[4748]: I0320 11:07:09.576972 4748 generic.go:334] "Generic (PLEG): container finished" podID="a0708128-eacf-422a-8dac-98032a9f12e7" containerID="06504aa24b314c15c08b5139f367c9391bfa38b4788810518bd1c22fc37ea54f" exitCode=0 Mar 20 11:07:09 crc kubenswrapper[4748]: I0320 11:07:09.577176 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" event={"ID":"a0708128-eacf-422a-8dac-98032a9f12e7","Type":"ContainerDied","Data":"06504aa24b314c15c08b5139f367c9391bfa38b4788810518bd1c22fc37ea54f"} Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.186939 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.316316 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft278\" (UniqueName: \"kubernetes.io/projected/a0708128-eacf-422a-8dac-98032a9f12e7-kube-api-access-ft278\") pod \"a0708128-eacf-422a-8dac-98032a9f12e7\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.316487 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-ssh-key-openstack-edpm-ipam\") pod \"a0708128-eacf-422a-8dac-98032a9f12e7\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.316531 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-inventory\") pod \"a0708128-eacf-422a-8dac-98032a9f12e7\" (UID: \"a0708128-eacf-422a-8dac-98032a9f12e7\") " Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.325128 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0708128-eacf-422a-8dac-98032a9f12e7-kube-api-access-ft278" (OuterVolumeSpecName: "kube-api-access-ft278") pod "a0708128-eacf-422a-8dac-98032a9f12e7" (UID: "a0708128-eacf-422a-8dac-98032a9f12e7"). InnerVolumeSpecName "kube-api-access-ft278". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.349052 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-inventory" (OuterVolumeSpecName: "inventory") pod "a0708128-eacf-422a-8dac-98032a9f12e7" (UID: "a0708128-eacf-422a-8dac-98032a9f12e7"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.349891 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a0708128-eacf-422a-8dac-98032a9f12e7" (UID: "a0708128-eacf-422a-8dac-98032a9f12e7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.418525 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.418563 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft278\" (UniqueName: \"kubernetes.io/projected/a0708128-eacf-422a-8dac-98032a9f12e7-kube-api-access-ft278\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.418574 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0708128-eacf-422a-8dac-98032a9f12e7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.516760 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:07:11 crc kubenswrapper[4748]: E0320 11:07:11.517129 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.609544 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" event={"ID":"a0708128-eacf-422a-8dac-98032a9f12e7","Type":"ContainerDied","Data":"3fd3817a58e3c94536b977651e0adcc06522ca0fd3fe1a7f33e7c65d1ec644ca"} Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.609600 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fd3817a58e3c94536b977651e0adcc06522ca0fd3fe1a7f33e7c65d1ec644ca" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.609634 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rzshm" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.698062 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2"] Mar 20 11:07:11 crc kubenswrapper[4748]: E0320 11:07:11.698466 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08153fb7-b276-48d4-a0ad-c4a433f5db7f" containerName="oc" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.698480 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="08153fb7-b276-48d4-a0ad-c4a433f5db7f" containerName="oc" Mar 20 11:07:11 crc kubenswrapper[4748]: E0320 11:07:11.698493 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0708128-eacf-422a-8dac-98032a9f12e7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.698500 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0708128-eacf-422a-8dac-98032a9f12e7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.698679 4748 
memory_manager.go:354] "RemoveStaleState removing state" podUID="08153fb7-b276-48d4-a0ad-c4a433f5db7f" containerName="oc" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.698696 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0708128-eacf-422a-8dac-98032a9f12e7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.699306 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.704448 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.704613 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.704681 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.704914 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.714561 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2"] Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.834989 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:11 
crc kubenswrapper[4748]: I0320 11:07:11.835470 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkszd\" (UniqueName: \"kubernetes.io/projected/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-kube-api-access-xkszd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.836013 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.938599 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkszd\" (UniqueName: \"kubernetes.io/projected/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-kube-api-access-xkszd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.938735 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.938810 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.945071 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.950808 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:11 crc kubenswrapper[4748]: I0320 11:07:11.956027 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkszd\" (UniqueName: \"kubernetes.io/projected/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-kube-api-access-xkszd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:12 crc kubenswrapper[4748]: I0320 11:07:12.034282 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:07:12 crc kubenswrapper[4748]: I0320 11:07:12.603573 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2"] Mar 20 11:07:12 crc kubenswrapper[4748]: I0320 11:07:12.628063 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" event={"ID":"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e","Type":"ContainerStarted","Data":"00bf89982b66b96147b35f651043a7bab20d9f16bfb3a183ae06a3f22a5abe26"} Mar 20 11:07:13 crc kubenswrapper[4748]: I0320 11:07:13.658484 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" event={"ID":"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e","Type":"ContainerStarted","Data":"4858a55e527bead5ace10d121ae828d93855d3b8b624694d0e5c5c3bf8319544"} Mar 20 11:07:15 crc kubenswrapper[4748]: I0320 11:07:15.041960 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" podStartSLOduration=3.8552928079999997 podStartE2EDuration="4.041799008s" podCreationTimestamp="2026-03-20 11:07:11 +0000 UTC" firstStartedPulling="2026-03-20 11:07:12.613562375 +0000 UTC m=+1867.755108189" lastFinishedPulling="2026-03-20 11:07:12.800068575 +0000 UTC m=+1867.941614389" observedRunningTime="2026-03-20 11:07:13.690373448 +0000 UTC m=+1868.831919262" watchObservedRunningTime="2026-03-20 11:07:15.041799008 +0000 UTC m=+1870.183344822" Mar 20 11:07:15 crc kubenswrapper[4748]: I0320 11:07:15.048125 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-j7jqs"] Mar 20 11:07:15 crc kubenswrapper[4748]: I0320 11:07:15.059140 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-j7jqs"] Mar 20 11:07:15 crc 
kubenswrapper[4748]: I0320 11:07:15.530922 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a4f230-3fe6-44a4-a91b-5b0ea07ae755" path="/var/lib/kubelet/pods/a4a4f230-3fe6-44a4-a91b-5b0ea07ae755/volumes" Mar 20 11:07:18 crc kubenswrapper[4748]: I0320 11:07:18.032326 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-r6t2v"] Mar 20 11:07:18 crc kubenswrapper[4748]: I0320 11:07:18.041694 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-r6t2v"] Mar 20 11:07:19 crc kubenswrapper[4748]: I0320 11:07:19.159131 4748 scope.go:117] "RemoveContainer" containerID="082ff8fe131c6a070446d24be4bd88a999c7470b5b6ea25ae826803de9f4f0c7" Mar 20 11:07:19 crc kubenswrapper[4748]: I0320 11:07:19.211817 4748 scope.go:117] "RemoveContainer" containerID="219f24c3854d372dd4332eb0a35515344f2b41d6e05185335df317097eb29204" Mar 20 11:07:19 crc kubenswrapper[4748]: I0320 11:07:19.261740 4748 scope.go:117] "RemoveContainer" containerID="c0e9914d7fc10214a4367f42562b7721df70ea6cc98d9004fbc125748550f8ec" Mar 20 11:07:19 crc kubenswrapper[4748]: I0320 11:07:19.526139 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f8f96f-de61-435d-a542-0b10d8860ccd" path="/var/lib/kubelet/pods/e7f8f96f-de61-435d-a542-0b10d8860ccd/volumes" Mar 20 11:07:21 crc kubenswrapper[4748]: I0320 11:07:21.037970 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rm5bp"] Mar 20 11:07:21 crc kubenswrapper[4748]: I0320 11:07:21.051204 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rm5bp"] Mar 20 11:07:21 crc kubenswrapper[4748]: I0320 11:07:21.526596 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af16052b-a5ab-4244-b007-69a32d050a35" path="/var/lib/kubelet/pods/af16052b-a5ab-4244-b007-69a32d050a35/volumes" Mar 20 11:07:26 crc kubenswrapper[4748]: I0320 11:07:26.515879 4748 scope.go:117] "RemoveContainer" 
containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:07:26 crc kubenswrapper[4748]: E0320 11:07:26.516750 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:07:39 crc kubenswrapper[4748]: I0320 11:07:39.516295 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:07:39 crc kubenswrapper[4748]: E0320 11:07:39.517233 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:07:52 crc kubenswrapper[4748]: I0320 11:07:52.516177 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:07:52 crc kubenswrapper[4748]: E0320 11:07:52.517114 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.053245 4748 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-dqp69"] Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.065266 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-89ab-account-create-update-qtgz6"] Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.073047 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6fd8-account-create-update-mcv9d"] Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.081477 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d299-account-create-update-xxxtz"] Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.090499 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-pcsgd"] Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.100586 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d299-account-create-update-xxxtz"] Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.107662 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-dqp69"] Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.117155 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-89ab-account-create-update-qtgz6"] Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.124907 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-pcsgd"] Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.131971 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6fd8-account-create-update-mcv9d"] Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.527302 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b48274-d738-49ad-81a5-a7c701193695" path="/var/lib/kubelet/pods/62b48274-d738-49ad-81a5-a7c701193695/volumes" Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.528701 4748 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="777d5c45-8727-451f-bb86-6048a03ceb0b" path="/var/lib/kubelet/pods/777d5c45-8727-451f-bb86-6048a03ceb0b/volumes" Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.529532 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2dafa6c-d784-48eb-926d-0648fb990dd4" path="/var/lib/kubelet/pods/c2dafa6c-d784-48eb-926d-0648fb990dd4/volumes" Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.530333 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde56056-a261-4e3a-8cd6-b703d33a14ca" path="/var/lib/kubelet/pods/dde56056-a261-4e3a-8cd6-b703d33a14ca/volumes" Mar 20 11:07:59 crc kubenswrapper[4748]: I0320 11:07:59.531704 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e90432d5-5ca1-4447-9e0b-2afaafa0ba1b" path="/var/lib/kubelet/pods/e90432d5-5ca1-4447-9e0b-2afaafa0ba1b/volumes" Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.040393 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-c6v9d"] Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.052228 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-c6v9d"] Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.147920 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566748-tsmnw"] Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.149451 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-tsmnw" Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.153349 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.163479 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.163657 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.190316 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-tsmnw"] Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.242139 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf5w2\" (UniqueName: \"kubernetes.io/projected/446f9bb3-d2aa-4f62-afbe-74e108a6c13d-kube-api-access-zf5w2\") pod \"auto-csr-approver-29566748-tsmnw\" (UID: \"446f9bb3-d2aa-4f62-afbe-74e108a6c13d\") " pod="openshift-infra/auto-csr-approver-29566748-tsmnw" Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.344363 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf5w2\" (UniqueName: \"kubernetes.io/projected/446f9bb3-d2aa-4f62-afbe-74e108a6c13d-kube-api-access-zf5w2\") pod \"auto-csr-approver-29566748-tsmnw\" (UID: \"446f9bb3-d2aa-4f62-afbe-74e108a6c13d\") " pod="openshift-infra/auto-csr-approver-29566748-tsmnw" Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.370186 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf5w2\" (UniqueName: \"kubernetes.io/projected/446f9bb3-d2aa-4f62-afbe-74e108a6c13d-kube-api-access-zf5w2\") pod \"auto-csr-approver-29566748-tsmnw\" (UID: \"446f9bb3-d2aa-4f62-afbe-74e108a6c13d\") " 
pod="openshift-infra/auto-csr-approver-29566748-tsmnw" Mar 20 11:08:00 crc kubenswrapper[4748]: I0320 11:08:00.526339 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-tsmnw" Mar 20 11:08:01 crc kubenswrapper[4748]: I0320 11:08:01.020125 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-tsmnw"] Mar 20 11:08:01 crc kubenswrapper[4748]: I0320 11:08:01.117265 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-tsmnw" event={"ID":"446f9bb3-d2aa-4f62-afbe-74e108a6c13d","Type":"ContainerStarted","Data":"a4e3ead6242321f08bed7f75471bcf3d88f9661881ce46ac1dd13aab64a6609d"} Mar 20 11:08:01 crc kubenswrapper[4748]: I0320 11:08:01.530235 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25" path="/var/lib/kubelet/pods/e9e2c5e9-4e94-47b2-82f3-c11dac5bfa25/volumes" Mar 20 11:08:03 crc kubenswrapper[4748]: I0320 11:08:03.141951 4748 generic.go:334] "Generic (PLEG): container finished" podID="446f9bb3-d2aa-4f62-afbe-74e108a6c13d" containerID="c4b3dd8c12084b37e4d24b9db4f6ab202949557327c8e66e35d4553c55c06976" exitCode=0 Mar 20 11:08:03 crc kubenswrapper[4748]: I0320 11:08:03.142023 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-tsmnw" event={"ID":"446f9bb3-d2aa-4f62-afbe-74e108a6c13d","Type":"ContainerDied","Data":"c4b3dd8c12084b37e4d24b9db4f6ab202949557327c8e66e35d4553c55c06976"} Mar 20 11:08:04 crc kubenswrapper[4748]: I0320 11:08:04.602094 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-tsmnw" Mar 20 11:08:04 crc kubenswrapper[4748]: I0320 11:08:04.649328 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf5w2\" (UniqueName: \"kubernetes.io/projected/446f9bb3-d2aa-4f62-afbe-74e108a6c13d-kube-api-access-zf5w2\") pod \"446f9bb3-d2aa-4f62-afbe-74e108a6c13d\" (UID: \"446f9bb3-d2aa-4f62-afbe-74e108a6c13d\") " Mar 20 11:08:04 crc kubenswrapper[4748]: I0320 11:08:04.659188 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/446f9bb3-d2aa-4f62-afbe-74e108a6c13d-kube-api-access-zf5w2" (OuterVolumeSpecName: "kube-api-access-zf5w2") pod "446f9bb3-d2aa-4f62-afbe-74e108a6c13d" (UID: "446f9bb3-d2aa-4f62-afbe-74e108a6c13d"). InnerVolumeSpecName "kube-api-access-zf5w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:08:04 crc kubenswrapper[4748]: I0320 11:08:04.752813 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf5w2\" (UniqueName: \"kubernetes.io/projected/446f9bb3-d2aa-4f62-afbe-74e108a6c13d-kube-api-access-zf5w2\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:05 crc kubenswrapper[4748]: I0320 11:08:05.188708 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-tsmnw" event={"ID":"446f9bb3-d2aa-4f62-afbe-74e108a6c13d","Type":"ContainerDied","Data":"a4e3ead6242321f08bed7f75471bcf3d88f9661881ce46ac1dd13aab64a6609d"} Mar 20 11:08:05 crc kubenswrapper[4748]: I0320 11:08:05.188771 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4e3ead6242321f08bed7f75471bcf3d88f9661881ce46ac1dd13aab64a6609d" Mar 20 11:08:05 crc kubenswrapper[4748]: I0320 11:08:05.188868 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-tsmnw" Mar 20 11:08:05 crc kubenswrapper[4748]: I0320 11:08:05.523945 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:08:05 crc kubenswrapper[4748]: E0320 11:08:05.524298 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:08:05 crc kubenswrapper[4748]: I0320 11:08:05.717879 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-g82x8"] Mar 20 11:08:05 crc kubenswrapper[4748]: I0320 11:08:05.728558 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-g82x8"] Mar 20 11:08:07 crc kubenswrapper[4748]: I0320 11:08:07.530378 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97916155-2786-4b69-afc3-86954c6b4b50" path="/var/lib/kubelet/pods/97916155-2786-4b69-afc3-86954c6b4b50/volumes" Mar 20 11:08:18 crc kubenswrapper[4748]: I0320 11:08:18.515805 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:08:18 crc kubenswrapper[4748]: E0320 11:08:18.517152 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" 
podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:08:19 crc kubenswrapper[4748]: I0320 11:08:19.394814 4748 scope.go:117] "RemoveContainer" containerID="215e76e819f49de525ff0060641fbe0eff4c291c6d92a2b41f7a99c17b5747e3" Mar 20 11:08:19 crc kubenswrapper[4748]: I0320 11:08:19.438718 4748 scope.go:117] "RemoveContainer" containerID="b38554dd1be02a45128fe9b6df10200ec1e855d0b4273f34b69a7887f02090c1" Mar 20 11:08:19 crc kubenswrapper[4748]: I0320 11:08:19.502945 4748 scope.go:117] "RemoveContainer" containerID="3bc0db86ee4b53156414e6bc24baf2c78fbeca995b3c9e9e4d945b908cfcb834" Mar 20 11:08:19 crc kubenswrapper[4748]: I0320 11:08:19.586316 4748 scope.go:117] "RemoveContainer" containerID="e7df77cd00a4e6f0259903becec720d11b8639fd32460f92aa673a4d87b59994" Mar 20 11:08:19 crc kubenswrapper[4748]: I0320 11:08:19.616921 4748 scope.go:117] "RemoveContainer" containerID="a5cce16e1895a78c7c2ba4c5b940d7ec88cb5d9dfe1c69b5555e34ea041a135b" Mar 20 11:08:19 crc kubenswrapper[4748]: I0320 11:08:19.698714 4748 scope.go:117] "RemoveContainer" containerID="9bd4f5b57b785ddc899d661a8c8d5dd749ac08edccafe7bb665a8d95b9604f2a" Mar 20 11:08:19 crc kubenswrapper[4748]: I0320 11:08:19.738021 4748 scope.go:117] "RemoveContainer" containerID="0e20c16be7121fffee41879fde884299ec5b691be8fd53dffb29b0715fcc4d7b" Mar 20 11:08:19 crc kubenswrapper[4748]: I0320 11:08:19.763714 4748 scope.go:117] "RemoveContainer" containerID="f2a8ddd516cf1926db46fb199dfb1f31a737a26079dd893201121819223acf58" Mar 20 11:08:19 crc kubenswrapper[4748]: I0320 11:08:19.807253 4748 scope.go:117] "RemoveContainer" containerID="c1883afcac4f293a3d3b5ec3f2d4c57afa2b80b2c38b93e3a42b5c4c78645574" Mar 20 11:08:22 crc kubenswrapper[4748]: I0320 11:08:22.406630 4748 generic.go:334] "Generic (PLEG): container finished" podID="2e4d68e5-3aee-40fe-98fb-a2c06bdd601e" containerID="4858a55e527bead5ace10d121ae828d93855d3b8b624694d0e5c5c3bf8319544" exitCode=0 Mar 20 11:08:22 crc kubenswrapper[4748]: I0320 11:08:22.406703 4748 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" event={"ID":"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e","Type":"ContainerDied","Data":"4858a55e527bead5ace10d121ae828d93855d3b8b624694d0e5c5c3bf8319544"} Mar 20 11:08:23 crc kubenswrapper[4748]: I0320 11:08:23.873371 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.015050 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkszd\" (UniqueName: \"kubernetes.io/projected/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-kube-api-access-xkszd\") pod \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.015271 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-inventory\") pod \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.015502 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-ssh-key-openstack-edpm-ipam\") pod \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\" (UID: \"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e\") " Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.024252 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-kube-api-access-xkszd" (OuterVolumeSpecName: "kube-api-access-xkszd") pod "2e4d68e5-3aee-40fe-98fb-a2c06bdd601e" (UID: "2e4d68e5-3aee-40fe-98fb-a2c06bdd601e"). InnerVolumeSpecName "kube-api-access-xkszd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.046716 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2e4d68e5-3aee-40fe-98fb-a2c06bdd601e" (UID: "2e4d68e5-3aee-40fe-98fb-a2c06bdd601e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.051803 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-inventory" (OuterVolumeSpecName: "inventory") pod "2e4d68e5-3aee-40fe-98fb-a2c06bdd601e" (UID: "2e4d68e5-3aee-40fe-98fb-a2c06bdd601e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.117221 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.117253 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkszd\" (UniqueName: \"kubernetes.io/projected/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-kube-api-access-xkszd\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.117264 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e4d68e5-3aee-40fe-98fb-a2c06bdd601e-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.429854 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" 
event={"ID":"2e4d68e5-3aee-40fe-98fb-a2c06bdd601e","Type":"ContainerDied","Data":"00bf89982b66b96147b35f651043a7bab20d9f16bfb3a183ae06a3f22a5abe26"} Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.429914 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.429924 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00bf89982b66b96147b35f651043a7bab20d9f16bfb3a183ae06a3f22a5abe26" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.637259 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf"] Mar 20 11:08:24 crc kubenswrapper[4748]: E0320 11:08:24.637921 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446f9bb3-d2aa-4f62-afbe-74e108a6c13d" containerName="oc" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.637953 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="446f9bb3-d2aa-4f62-afbe-74e108a6c13d" containerName="oc" Mar 20 11:08:24 crc kubenswrapper[4748]: E0320 11:08:24.637990 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4d68e5-3aee-40fe-98fb-a2c06bdd601e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.638004 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4d68e5-3aee-40fe-98fb-a2c06bdd601e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.638327 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="446f9bb3-d2aa-4f62-afbe-74e108a6c13d" containerName="oc" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.638361 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4d68e5-3aee-40fe-98fb-a2c06bdd601e" 
containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.639346 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.643004 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.643332 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.643715 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.643823 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.652721 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf"] Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.838129 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.838521 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt2jr\" (UniqueName: \"kubernetes.io/projected/2da6177a-9350-445c-820e-cf678dfd5500-kube-api-access-tt2jr\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.838758 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.941970 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.942058 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.942092 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt2jr\" (UniqueName: \"kubernetes.io/projected/2da6177a-9350-445c-820e-cf678dfd5500-kube-api-access-tt2jr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.947582 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.958148 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.961625 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt2jr\" (UniqueName: \"kubernetes.io/projected/2da6177a-9350-445c-820e-cf678dfd5500-kube-api-access-tt2jr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:24 crc kubenswrapper[4748]: I0320 11:08:24.967404 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:25 crc kubenswrapper[4748]: I0320 11:08:25.524987 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf"] Mar 20 11:08:26 crc kubenswrapper[4748]: I0320 11:08:26.452122 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" event={"ID":"2da6177a-9350-445c-820e-cf678dfd5500","Type":"ContainerStarted","Data":"5e5961e36ab118ae342321335119d05d818dd27e8280e18f0d7529dd164405d5"} Mar 20 11:08:26 crc kubenswrapper[4748]: I0320 11:08:26.453349 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" event={"ID":"2da6177a-9350-445c-820e-cf678dfd5500","Type":"ContainerStarted","Data":"645f9b9a8bb3c3fa41c621a4f644976c583aafe6e3483264dc202d7eb7c00de1"} Mar 20 11:08:26 crc kubenswrapper[4748]: I0320 11:08:26.479143 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" podStartSLOduration=2.246201399 podStartE2EDuration="2.479121461s" podCreationTimestamp="2026-03-20 11:08:24 +0000 UTC" firstStartedPulling="2026-03-20 11:08:25.524419905 +0000 UTC m=+1940.665965719" lastFinishedPulling="2026-03-20 11:08:25.757339957 +0000 UTC m=+1940.898885781" observedRunningTime="2026-03-20 11:08:26.470321911 +0000 UTC m=+1941.611867725" watchObservedRunningTime="2026-03-20 11:08:26.479121461 +0000 UTC m=+1941.620667275" Mar 20 11:08:31 crc kubenswrapper[4748]: I0320 11:08:31.519013 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:08:31 crc kubenswrapper[4748]: E0320 11:08:31.520433 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:08:32 crc kubenswrapper[4748]: I0320 11:08:32.513648 4748 generic.go:334] "Generic (PLEG): container finished" podID="2da6177a-9350-445c-820e-cf678dfd5500" containerID="5e5961e36ab118ae342321335119d05d818dd27e8280e18f0d7529dd164405d5" exitCode=0 Mar 20 11:08:32 crc kubenswrapper[4748]: I0320 11:08:32.513762 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" event={"ID":"2da6177a-9350-445c-820e-cf678dfd5500","Type":"ContainerDied","Data":"5e5961e36ab118ae342321335119d05d818dd27e8280e18f0d7529dd164405d5"} Mar 20 11:08:33 crc kubenswrapper[4748]: I0320 11:08:33.961747 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.071144 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt2jr\" (UniqueName: \"kubernetes.io/projected/2da6177a-9350-445c-820e-cf678dfd5500-kube-api-access-tt2jr\") pod \"2da6177a-9350-445c-820e-cf678dfd5500\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.071924 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-inventory\") pod \"2da6177a-9350-445c-820e-cf678dfd5500\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.072281 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-ssh-key-openstack-edpm-ipam\") pod \"2da6177a-9350-445c-820e-cf678dfd5500\" (UID: \"2da6177a-9350-445c-820e-cf678dfd5500\") " Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.080502 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da6177a-9350-445c-820e-cf678dfd5500-kube-api-access-tt2jr" (OuterVolumeSpecName: "kube-api-access-tt2jr") pod "2da6177a-9350-445c-820e-cf678dfd5500" (UID: "2da6177a-9350-445c-820e-cf678dfd5500"). InnerVolumeSpecName "kube-api-access-tt2jr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.107289 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2da6177a-9350-445c-820e-cf678dfd5500" (UID: "2da6177a-9350-445c-820e-cf678dfd5500"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.117634 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-inventory" (OuterVolumeSpecName: "inventory") pod "2da6177a-9350-445c-820e-cf678dfd5500" (UID: "2da6177a-9350-445c-820e-cf678dfd5500"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.175442 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.175488 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt2jr\" (UniqueName: \"kubernetes.io/projected/2da6177a-9350-445c-820e-cf678dfd5500-kube-api-access-tt2jr\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.175502 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2da6177a-9350-445c-820e-cf678dfd5500-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.538005 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" 
event={"ID":"2da6177a-9350-445c-820e-cf678dfd5500","Type":"ContainerDied","Data":"645f9b9a8bb3c3fa41c621a4f644976c583aafe6e3483264dc202d7eb7c00de1"} Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.538100 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="645f9b9a8bb3c3fa41c621a4f644976c583aafe6e3483264dc202d7eb7c00de1" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.538120 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.615698 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v"] Mar 20 11:08:34 crc kubenswrapper[4748]: E0320 11:08:34.616209 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da6177a-9350-445c-820e-cf678dfd5500" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.616225 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da6177a-9350-445c-820e-cf678dfd5500" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.616409 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da6177a-9350-445c-820e-cf678dfd5500" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.617209 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.626004 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.626111 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.626236 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.627235 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.634172 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v"] Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.689513 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n6f7v\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.690198 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n6f7v\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.690233 4748 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9z74\" (UniqueName: \"kubernetes.io/projected/93a46290-fef3-4e7a-9cb3-682c3f453cc1-kube-api-access-n9z74\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n6f7v\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.792257 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n6f7v\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.792309 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9z74\" (UniqueName: \"kubernetes.io/projected/93a46290-fef3-4e7a-9cb3-682c3f453cc1-kube-api-access-n9z74\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n6f7v\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.792417 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n6f7v\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.797512 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n6f7v\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.797531 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n6f7v\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.811525 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9z74\" (UniqueName: \"kubernetes.io/projected/93a46290-fef3-4e7a-9cb3-682c3f453cc1-kube-api-access-n9z74\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-n6f7v\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:34 crc kubenswrapper[4748]: I0320 11:08:34.935980 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:08:35 crc kubenswrapper[4748]: I0320 11:08:35.481486 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v"] Mar 20 11:08:35 crc kubenswrapper[4748]: I0320 11:08:35.546992 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" event={"ID":"93a46290-fef3-4e7a-9cb3-682c3f453cc1","Type":"ContainerStarted","Data":"74480e5d46bdaaa4158260f8e59b44336c2ad5aa0043907d64204c92d0213554"} Mar 20 11:08:36 crc kubenswrapper[4748]: I0320 11:08:36.557219 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" event={"ID":"93a46290-fef3-4e7a-9cb3-682c3f453cc1","Type":"ContainerStarted","Data":"53936b72c47fc4f29238ba1db46c05719a398a0ca91cc0654e73d556909c9888"} Mar 20 11:08:36 crc kubenswrapper[4748]: I0320 11:08:36.598191 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" podStartSLOduration=2.392266785 podStartE2EDuration="2.595805001s" podCreationTimestamp="2026-03-20 11:08:34 +0000 UTC" firstStartedPulling="2026-03-20 11:08:35.487559282 +0000 UTC m=+1950.629105096" lastFinishedPulling="2026-03-20 11:08:35.691097498 +0000 UTC m=+1950.832643312" observedRunningTime="2026-03-20 11:08:36.583065383 +0000 UTC m=+1951.724611197" watchObservedRunningTime="2026-03-20 11:08:36.595805001 +0000 UTC m=+1951.737350815" Mar 20 11:08:46 crc kubenswrapper[4748]: I0320 11:08:46.515500 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:08:46 crc kubenswrapper[4748]: E0320 11:08:46.516419 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:08:51 crc kubenswrapper[4748]: I0320 11:08:51.042634 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bs7cs"] Mar 20 11:08:51 crc kubenswrapper[4748]: I0320 11:08:51.049684 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bs7cs"] Mar 20 11:08:51 crc kubenswrapper[4748]: I0320 11:08:51.527305 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9" path="/var/lib/kubelet/pods/a154fa7a-7ef7-4c8b-ac86-b0508e4c1cb9/volumes" Mar 20 11:08:57 crc kubenswrapper[4748]: I0320 11:08:57.516505 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:08:57 crc kubenswrapper[4748]: E0320 11:08:57.517812 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:09:09 crc kubenswrapper[4748]: I0320 11:09:09.515798 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:09:09 crc kubenswrapper[4748]: E0320 11:09:09.516672 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:09:10 crc kubenswrapper[4748]: I0320 11:09:10.058793 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-hww8p"] Mar 20 11:09:10 crc kubenswrapper[4748]: I0320 11:09:10.073693 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-hww8p"] Mar 20 11:09:11 crc kubenswrapper[4748]: I0320 11:09:11.526628 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7263772-e7ec-43ad-815f-7c6a67575402" path="/var/lib/kubelet/pods/d7263772-e7ec-43ad-815f-7c6a67575402/volumes" Mar 20 11:09:16 crc kubenswrapper[4748]: I0320 11:09:16.160044 4748 generic.go:334] "Generic (PLEG): container finished" podID="93a46290-fef3-4e7a-9cb3-682c3f453cc1" containerID="53936b72c47fc4f29238ba1db46c05719a398a0ca91cc0654e73d556909c9888" exitCode=0 Mar 20 11:09:16 crc kubenswrapper[4748]: I0320 11:09:16.160120 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" event={"ID":"93a46290-fef3-4e7a-9cb3-682c3f453cc1","Type":"ContainerDied","Data":"53936b72c47fc4f29238ba1db46c05719a398a0ca91cc0654e73d556909c9888"} Mar 20 11:09:17 crc kubenswrapper[4748]: I0320 11:09:17.574840 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:09:17 crc kubenswrapper[4748]: I0320 11:09:17.674417 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-inventory\") pod \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " Mar 20 11:09:17 crc kubenswrapper[4748]: I0320 11:09:17.674725 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9z74\" (UniqueName: \"kubernetes.io/projected/93a46290-fef3-4e7a-9cb3-682c3f453cc1-kube-api-access-n9z74\") pod \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " Mar 20 11:09:17 crc kubenswrapper[4748]: I0320 11:09:17.674817 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-ssh-key-openstack-edpm-ipam\") pod \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\" (UID: \"93a46290-fef3-4e7a-9cb3-682c3f453cc1\") " Mar 20 11:09:17 crc kubenswrapper[4748]: I0320 11:09:17.680102 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a46290-fef3-4e7a-9cb3-682c3f453cc1-kube-api-access-n9z74" (OuterVolumeSpecName: "kube-api-access-n9z74") pod "93a46290-fef3-4e7a-9cb3-682c3f453cc1" (UID: "93a46290-fef3-4e7a-9cb3-682c3f453cc1"). InnerVolumeSpecName "kube-api-access-n9z74". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:09:17 crc kubenswrapper[4748]: I0320 11:09:17.703673 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-inventory" (OuterVolumeSpecName: "inventory") pod "93a46290-fef3-4e7a-9cb3-682c3f453cc1" (UID: "93a46290-fef3-4e7a-9cb3-682c3f453cc1"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:09:17 crc kubenswrapper[4748]: I0320 11:09:17.707891 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "93a46290-fef3-4e7a-9cb3-682c3f453cc1" (UID: "93a46290-fef3-4e7a-9cb3-682c3f453cc1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:09:17 crc kubenswrapper[4748]: I0320 11:09:17.777022 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:17 crc kubenswrapper[4748]: I0320 11:09:17.777060 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9z74\" (UniqueName: \"kubernetes.io/projected/93a46290-fef3-4e7a-9cb3-682c3f453cc1-kube-api-access-n9z74\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:17 crc kubenswrapper[4748]: I0320 11:09:17.777072 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93a46290-fef3-4e7a-9cb3-682c3f453cc1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.177198 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" event={"ID":"93a46290-fef3-4e7a-9cb3-682c3f453cc1","Type":"ContainerDied","Data":"74480e5d46bdaaa4158260f8e59b44336c2ad5aa0043907d64204c92d0213554"} Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.177239 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74480e5d46bdaaa4158260f8e59b44336c2ad5aa0043907d64204c92d0213554" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 
11:09:18.177290 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-n6f7v" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.265257 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr"] Mar 20 11:09:18 crc kubenswrapper[4748]: E0320 11:09:18.265701 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a46290-fef3-4e7a-9cb3-682c3f453cc1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.265725 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a46290-fef3-4e7a-9cb3-682c3f453cc1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.265986 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a46290-fef3-4e7a-9cb3-682c3f453cc1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.266726 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.268658 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.268994 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.269607 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.270333 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.279152 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr"] Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.390962 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.391029 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k85rq\" (UniqueName: \"kubernetes.io/projected/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-kube-api-access-k85rq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:18 crc 
kubenswrapper[4748]: I0320 11:09:18.391133 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.493485 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.493534 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k85rq\" (UniqueName: \"kubernetes.io/projected/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-kube-api-access-k85rq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.493594 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.498808 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.510164 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.511075 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k85rq\" (UniqueName: \"kubernetes.io/projected/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-kube-api-access-k85rq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:18 crc kubenswrapper[4748]: I0320 11:09:18.593737 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:09:19 crc kubenswrapper[4748]: I0320 11:09:19.085631 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr"] Mar 20 11:09:19 crc kubenswrapper[4748]: I0320 11:09:19.187322 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" event={"ID":"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1","Type":"ContainerStarted","Data":"0c98ecf2d05b90c15918952f938b038a3ae5e039b0c723a8908dc1bb1ac42c49"} Mar 20 11:09:19 crc kubenswrapper[4748]: I0320 11:09:19.998319 4748 scope.go:117] "RemoveContainer" containerID="6ffd362a35588b891b195eb4be044481eb6c22e82158760f46e425d88d3a61b6" Mar 20 11:09:20 crc kubenswrapper[4748]: I0320 11:09:20.064794 4748 scope.go:117] "RemoveContainer" containerID="5a49b5d8f18000cabf840c3c9026fdb78f1a5f83b9c6cce775ffeaa423653f08" Mar 20 11:09:20 crc kubenswrapper[4748]: I0320 11:09:20.203253 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" event={"ID":"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1","Type":"ContainerStarted","Data":"08dd6c8815dbe818a5286f1592c8eeb1b0ce7897e9ee57829c1c5c83bf244f7f"} Mar 20 11:09:20 crc kubenswrapper[4748]: I0320 11:09:20.238661 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" podStartSLOduration=2.040940252 podStartE2EDuration="2.23851544s" podCreationTimestamp="2026-03-20 11:09:18 +0000 UTC" firstStartedPulling="2026-03-20 11:09:19.090478153 +0000 UTC m=+1994.232023967" lastFinishedPulling="2026-03-20 11:09:19.288053341 +0000 UTC m=+1994.429599155" observedRunningTime="2026-03-20 11:09:20.230265574 +0000 UTC m=+1995.371811408" watchObservedRunningTime="2026-03-20 11:09:20.23851544 +0000 UTC m=+1995.380061254" Mar 20 11:09:22 crc 
kubenswrapper[4748]: I0320 11:09:22.516543 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:09:22 crc kubenswrapper[4748]: E0320 11:09:22.518481 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:09:23 crc kubenswrapper[4748]: I0320 11:09:23.037027 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rghcf"] Mar 20 11:09:23 crc kubenswrapper[4748]: I0320 11:09:23.053055 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rghcf"] Mar 20 11:09:23 crc kubenswrapper[4748]: I0320 11:09:23.531607 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="292d1168-7edc-4e05-a657-c03029450a6b" path="/var/lib/kubelet/pods/292d1168-7edc-4e05-a657-c03029450a6b/volumes" Mar 20 11:09:36 crc kubenswrapper[4748]: I0320 11:09:36.515478 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:09:36 crc kubenswrapper[4748]: E0320 11:09:36.516217 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:09:50 crc kubenswrapper[4748]: I0320 11:09:50.515924 4748 scope.go:117] 
"RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:09:50 crc kubenswrapper[4748]: E0320 11:09:50.518253 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:09:55 crc kubenswrapper[4748]: I0320 11:09:55.051143 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hkmq9"] Mar 20 11:09:55 crc kubenswrapper[4748]: I0320 11:09:55.065801 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hkmq9"] Mar 20 11:09:55 crc kubenswrapper[4748]: I0320 11:09:55.526512 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864d7cb6-6b7f-4b08-9555-9c89fb2f0e04" path="/var/lib/kubelet/pods/864d7cb6-6b7f-4b08-9555-9c89fb2f0e04/volumes" Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.147002 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566750-jcp97"] Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.149277 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-jcp97" Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.153393 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.153542 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.153666 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.166037 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-jcp97"] Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.244972 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw89n\" (UniqueName: \"kubernetes.io/projected/f13ae640-255e-402f-b0ab-a8fe649902bb-kube-api-access-jw89n\") pod \"auto-csr-approver-29566750-jcp97\" (UID: \"f13ae640-255e-402f-b0ab-a8fe649902bb\") " pod="openshift-infra/auto-csr-approver-29566750-jcp97" Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.347115 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw89n\" (UniqueName: \"kubernetes.io/projected/f13ae640-255e-402f-b0ab-a8fe649902bb-kube-api-access-jw89n\") pod \"auto-csr-approver-29566750-jcp97\" (UID: \"f13ae640-255e-402f-b0ab-a8fe649902bb\") " pod="openshift-infra/auto-csr-approver-29566750-jcp97" Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.365570 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw89n\" (UniqueName: \"kubernetes.io/projected/f13ae640-255e-402f-b0ab-a8fe649902bb-kube-api-access-jw89n\") pod \"auto-csr-approver-29566750-jcp97\" (UID: \"f13ae640-255e-402f-b0ab-a8fe649902bb\") " 
pod="openshift-infra/auto-csr-approver-29566750-jcp97" Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.477374 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-jcp97" Mar 20 11:10:00 crc kubenswrapper[4748]: I0320 11:10:00.911102 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-jcp97"] Mar 20 11:10:01 crc kubenswrapper[4748]: I0320 11:10:01.554352 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-jcp97" event={"ID":"f13ae640-255e-402f-b0ab-a8fe649902bb","Type":"ContainerStarted","Data":"60f3c6a99cd05220a4d75f092a8b43ec1ac5cf90be36e8aee629ae4ab4970cb0"} Mar 20 11:10:03 crc kubenswrapper[4748]: I0320 11:10:03.569769 4748 generic.go:334] "Generic (PLEG): container finished" podID="f13ae640-255e-402f-b0ab-a8fe649902bb" containerID="35540e14a6eb77ccda615e494db25e46793126f3df163f027d7ece3077f932aa" exitCode=0 Mar 20 11:10:03 crc kubenswrapper[4748]: I0320 11:10:03.569820 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-jcp97" event={"ID":"f13ae640-255e-402f-b0ab-a8fe649902bb","Type":"ContainerDied","Data":"35540e14a6eb77ccda615e494db25e46793126f3df163f027d7ece3077f932aa"} Mar 20 11:10:04 crc kubenswrapper[4748]: I0320 11:10:04.515628 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:10:04 crc kubenswrapper[4748]: E0320 11:10:04.516324 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" 
Mar 20 11:10:04 crc kubenswrapper[4748]: I0320 11:10:04.962861 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-jcp97" Mar 20 11:10:05 crc kubenswrapper[4748]: I0320 11:10:05.138321 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw89n\" (UniqueName: \"kubernetes.io/projected/f13ae640-255e-402f-b0ab-a8fe649902bb-kube-api-access-jw89n\") pod \"f13ae640-255e-402f-b0ab-a8fe649902bb\" (UID: \"f13ae640-255e-402f-b0ab-a8fe649902bb\") " Mar 20 11:10:05 crc kubenswrapper[4748]: I0320 11:10:05.143580 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13ae640-255e-402f-b0ab-a8fe649902bb-kube-api-access-jw89n" (OuterVolumeSpecName: "kube-api-access-jw89n") pod "f13ae640-255e-402f-b0ab-a8fe649902bb" (UID: "f13ae640-255e-402f-b0ab-a8fe649902bb"). InnerVolumeSpecName "kube-api-access-jw89n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:05 crc kubenswrapper[4748]: I0320 11:10:05.241248 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw89n\" (UniqueName: \"kubernetes.io/projected/f13ae640-255e-402f-b0ab-a8fe649902bb-kube-api-access-jw89n\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:05 crc kubenswrapper[4748]: I0320 11:10:05.588060 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-jcp97" event={"ID":"f13ae640-255e-402f-b0ab-a8fe649902bb","Type":"ContainerDied","Data":"60f3c6a99cd05220a4d75f092a8b43ec1ac5cf90be36e8aee629ae4ab4970cb0"} Mar 20 11:10:05 crc kubenswrapper[4748]: I0320 11:10:05.588124 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60f3c6a99cd05220a4d75f092a8b43ec1ac5cf90be36e8aee629ae4ab4970cb0" Mar 20 11:10:05 crc kubenswrapper[4748]: I0320 11:10:05.588212 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-jcp97" Mar 20 11:10:06 crc kubenswrapper[4748]: I0320 11:10:06.033583 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-2c55t"] Mar 20 11:10:06 crc kubenswrapper[4748]: I0320 11:10:06.043881 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-2c55t"] Mar 20 11:10:07 crc kubenswrapper[4748]: I0320 11:10:07.527759 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5d23e4-abaf-49c2-87df-fb1243d774f6" path="/var/lib/kubelet/pods/3b5d23e4-abaf-49c2-87df-fb1243d774f6/volumes" Mar 20 11:10:08 crc kubenswrapper[4748]: I0320 11:10:08.611212 4748 generic.go:334] "Generic (PLEG): container finished" podID="9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1" containerID="08dd6c8815dbe818a5286f1592c8eeb1b0ce7897e9ee57829c1c5c83bf244f7f" exitCode=0 Mar 20 11:10:08 crc kubenswrapper[4748]: I0320 11:10:08.611438 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" event={"ID":"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1","Type":"ContainerDied","Data":"08dd6c8815dbe818a5286f1592c8eeb1b0ce7897e9ee57829c1c5c83bf244f7f"} Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.032371 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.133979 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-inventory\") pod \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.134121 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k85rq\" (UniqueName: \"kubernetes.io/projected/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-kube-api-access-k85rq\") pod \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.134316 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-ssh-key-openstack-edpm-ipam\") pod \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\" (UID: \"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1\") " Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.139304 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-kube-api-access-k85rq" (OuterVolumeSpecName: "kube-api-access-k85rq") pod "9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1" (UID: "9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1"). InnerVolumeSpecName "kube-api-access-k85rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.162298 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-inventory" (OuterVolumeSpecName: "inventory") pod "9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1" (UID: "9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.162611 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1" (UID: "9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.237012 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.237067 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.237082 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k85rq\" (UniqueName: \"kubernetes.io/projected/9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1-kube-api-access-k85rq\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.641117 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" event={"ID":"9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1","Type":"ContainerDied","Data":"0c98ecf2d05b90c15918952f938b038a3ae5e039b0c723a8908dc1bb1ac42c49"} Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.641166 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c98ecf2d05b90c15918952f938b038a3ae5e039b0c723a8908dc1bb1ac42c49" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 
11:10:10.641230 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.744188 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-66zrt"] Mar 20 11:10:10 crc kubenswrapper[4748]: E0320 11:10:10.745425 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.745466 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:10:10 crc kubenswrapper[4748]: E0320 11:10:10.745503 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f13ae640-255e-402f-b0ab-a8fe649902bb" containerName="oc" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.745521 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13ae640-255e-402f-b0ab-a8fe649902bb" containerName="oc" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.746322 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f13ae640-255e-402f-b0ab-a8fe649902bb" containerName="oc" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.746469 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.747695 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.757705 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-66zrt"] Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.803688 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.803774 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.804277 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.804581 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.848504 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sczwc\" (UniqueName: \"kubernetes.io/projected/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-kube-api-access-sczwc\") pod \"ssh-known-hosts-edpm-deployment-66zrt\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.848574 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-66zrt\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.848736 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-66zrt\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.950467 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-66zrt\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.950600 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sczwc\" (UniqueName: \"kubernetes.io/projected/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-kube-api-access-sczwc\") pod \"ssh-known-hosts-edpm-deployment-66zrt\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.950656 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-66zrt\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.955066 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-66zrt\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 
20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.956402 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-66zrt\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:10 crc kubenswrapper[4748]: I0320 11:10:10.968572 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sczwc\" (UniqueName: \"kubernetes.io/projected/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-kube-api-access-sczwc\") pod \"ssh-known-hosts-edpm-deployment-66zrt\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:11 crc kubenswrapper[4748]: I0320 11:10:11.131745 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:11 crc kubenswrapper[4748]: I0320 11:10:11.673222 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-66zrt"] Mar 20 11:10:12 crc kubenswrapper[4748]: I0320 11:10:12.660046 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" event={"ID":"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b","Type":"ContainerStarted","Data":"c3b93825d4063a2e51013706e56d4410e791440bc90cac6a69dd0b468fd6a077"} Mar 20 11:10:12 crc kubenswrapper[4748]: I0320 11:10:12.660375 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" event={"ID":"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b","Type":"ContainerStarted","Data":"b3445587308e97e550ab872544978dd452dba90f04470c4d0d5ff941f73a76d4"} Mar 20 11:10:12 crc kubenswrapper[4748]: I0320 11:10:12.685129 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" podStartSLOduration=2.475511889 podStartE2EDuration="2.685112177s" podCreationTimestamp="2026-03-20 11:10:10 +0000 UTC" firstStartedPulling="2026-03-20 11:10:11.681643141 +0000 UTC m=+2046.823188955" lastFinishedPulling="2026-03-20 11:10:11.891243429 +0000 UTC m=+2047.032789243" observedRunningTime="2026-03-20 11:10:12.676647305 +0000 UTC m=+2047.818193119" watchObservedRunningTime="2026-03-20 11:10:12.685112177 +0000 UTC m=+2047.826657991" Mar 20 11:10:15 crc kubenswrapper[4748]: E0320 11:10:15.677903 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13ae640_255e_402f_b0ab_a8fe649902bb.slice/crio-60f3c6a99cd05220a4d75f092a8b43ec1ac5cf90be36e8aee629ae4ab4970cb0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13ae640_255e_402f_b0ab_a8fe649902bb.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:10:18 crc kubenswrapper[4748]: I0320 11:10:18.516539 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:10:18 crc kubenswrapper[4748]: E0320 11:10:18.518045 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:10:18 crc kubenswrapper[4748]: I0320 11:10:18.721271 4748 generic.go:334] "Generic (PLEG): container finished" podID="80a3504f-a9f2-4be3-9f87-e4110fb5fc7b" containerID="c3b93825d4063a2e51013706e56d4410e791440bc90cac6a69dd0b468fd6a077" 
exitCode=0 Mar 20 11:10:18 crc kubenswrapper[4748]: I0320 11:10:18.721321 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" event={"ID":"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b","Type":"ContainerDied","Data":"c3b93825d4063a2e51013706e56d4410e791440bc90cac6a69dd0b468fd6a077"} Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.136649 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.181521 4748 scope.go:117] "RemoveContainer" containerID="2273cd0372819a347527cbc3d6009a6f2bb4e2e90d83429149b6691f73d025d6" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.224984 4748 scope.go:117] "RemoveContainer" containerID="d1ebbb3667bd4d483bc0e2f4be4610d8a04be8354aac381ebc50df5e6bad9595" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.277705 4748 scope.go:117] "RemoveContainer" containerID="4b8527fdfa261ede39a84da80f5b2d12a0691b869c53b7e9ffce1b4b103e38e7" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.330297 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sczwc\" (UniqueName: \"kubernetes.io/projected/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-kube-api-access-sczwc\") pod \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.330454 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-inventory-0\") pod \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.330594 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-ssh-key-openstack-edpm-ipam\") pod \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\" (UID: \"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b\") " Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.335851 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-kube-api-access-sczwc" (OuterVolumeSpecName: "kube-api-access-sczwc") pod "80a3504f-a9f2-4be3-9f87-e4110fb5fc7b" (UID: "80a3504f-a9f2-4be3-9f87-e4110fb5fc7b"). InnerVolumeSpecName "kube-api-access-sczwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.358495 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "80a3504f-a9f2-4be3-9f87-e4110fb5fc7b" (UID: "80a3504f-a9f2-4be3-9f87-e4110fb5fc7b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.358919 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "80a3504f-a9f2-4be3-9f87-e4110fb5fc7b" (UID: "80a3504f-a9f2-4be3-9f87-e4110fb5fc7b"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.432990 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sczwc\" (UniqueName: \"kubernetes.io/projected/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-kube-api-access-sczwc\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.433049 4748 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.433061 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80a3504f-a9f2-4be3-9f87-e4110fb5fc7b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.740740 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" event={"ID":"80a3504f-a9f2-4be3-9f87-e4110fb5fc7b","Type":"ContainerDied","Data":"b3445587308e97e550ab872544978dd452dba90f04470c4d0d5ff941f73a76d4"} Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.741125 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3445587308e97e550ab872544978dd452dba90f04470c4d0d5ff941f73a76d4" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.740800 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-66zrt" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.820018 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt"] Mar 20 11:10:20 crc kubenswrapper[4748]: E0320 11:10:20.820526 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a3504f-a9f2-4be3-9f87-e4110fb5fc7b" containerName="ssh-known-hosts-edpm-deployment" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.820562 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a3504f-a9f2-4be3-9f87-e4110fb5fc7b" containerName="ssh-known-hosts-edpm-deployment" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.820782 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a3504f-a9f2-4be3-9f87-e4110fb5fc7b" containerName="ssh-known-hosts-edpm-deployment" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.821610 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.823807 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.825095 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.825496 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.825640 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.833895 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt"] Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.942449 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gl7rt\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.942500 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-922gn\" (UniqueName: \"kubernetes.io/projected/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-kube-api-access-922gn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gl7rt\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:20 crc kubenswrapper[4748]: I0320 11:10:20.942576 4748 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gl7rt\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:21 crc kubenswrapper[4748]: I0320 11:10:21.044715 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gl7rt\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:21 crc kubenswrapper[4748]: I0320 11:10:21.044780 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-922gn\" (UniqueName: \"kubernetes.io/projected/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-kube-api-access-922gn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gl7rt\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:21 crc kubenswrapper[4748]: I0320 11:10:21.044921 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gl7rt\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:21 crc kubenswrapper[4748]: I0320 11:10:21.049553 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-gl7rt\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:21 crc kubenswrapper[4748]: I0320 11:10:21.058075 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gl7rt\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:21 crc kubenswrapper[4748]: I0320 11:10:21.063418 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-922gn\" (UniqueName: \"kubernetes.io/projected/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-kube-api-access-922gn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gl7rt\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:21 crc kubenswrapper[4748]: I0320 11:10:21.147081 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:21 crc kubenswrapper[4748]: I0320 11:10:21.664766 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt"] Mar 20 11:10:21 crc kubenswrapper[4748]: W0320 11:10:21.669827 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ae809a6_7d0a_4b85_a623_eda42d60e2d7.slice/crio-79722b3ceaa6fb0bc44eb6dc4700c6af5df97b84490c616b29c6b25996122d51 WatchSource:0}: Error finding container 79722b3ceaa6fb0bc44eb6dc4700c6af5df97b84490c616b29c6b25996122d51: Status 404 returned error can't find the container with id 79722b3ceaa6fb0bc44eb6dc4700c6af5df97b84490c616b29c6b25996122d51 Mar 20 11:10:21 crc kubenswrapper[4748]: I0320 11:10:21.749996 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" event={"ID":"4ae809a6-7d0a-4b85-a623-eda42d60e2d7","Type":"ContainerStarted","Data":"79722b3ceaa6fb0bc44eb6dc4700c6af5df97b84490c616b29c6b25996122d51"} Mar 20 11:10:22 crc kubenswrapper[4748]: I0320 11:10:22.767967 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" event={"ID":"4ae809a6-7d0a-4b85-a623-eda42d60e2d7","Type":"ContainerStarted","Data":"f81789411b3c2e4cb517617eb3a24615fd1dcee12c4cd3e8b994334d141944b9"} Mar 20 11:10:22 crc kubenswrapper[4748]: I0320 11:10:22.791242 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" podStartSLOduration=2.6103079940000002 podStartE2EDuration="2.791216334s" podCreationTimestamp="2026-03-20 11:10:20 +0000 UTC" firstStartedPulling="2026-03-20 11:10:21.672459 +0000 UTC m=+2056.814004814" lastFinishedPulling="2026-03-20 11:10:21.85336734 +0000 UTC m=+2056.994913154" 
observedRunningTime="2026-03-20 11:10:22.786378053 +0000 UTC m=+2057.927923887" watchObservedRunningTime="2026-03-20 11:10:22.791216334 +0000 UTC m=+2057.932762148" Mar 20 11:10:25 crc kubenswrapper[4748]: E0320 11:10:25.916955 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13ae640_255e_402f_b0ab_a8fe649902bb.slice/crio-60f3c6a99cd05220a4d75f092a8b43ec1ac5cf90be36e8aee629ae4ab4970cb0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13ae640_255e_402f_b0ab_a8fe649902bb.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:10:29 crc kubenswrapper[4748]: I0320 11:10:29.823527 4748 generic.go:334] "Generic (PLEG): container finished" podID="4ae809a6-7d0a-4b85-a623-eda42d60e2d7" containerID="f81789411b3c2e4cb517617eb3a24615fd1dcee12c4cd3e8b994334d141944b9" exitCode=0 Mar 20 11:10:29 crc kubenswrapper[4748]: I0320 11:10:29.823604 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" event={"ID":"4ae809a6-7d0a-4b85-a623-eda42d60e2d7","Type":"ContainerDied","Data":"f81789411b3c2e4cb517617eb3a24615fd1dcee12c4cd3e8b994334d141944b9"} Mar 20 11:10:30 crc kubenswrapper[4748]: I0320 11:10:30.515649 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:10:30 crc kubenswrapper[4748]: E0320 11:10:30.516072 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" 
Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.329687 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.455877 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-inventory\") pod \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.455925 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-ssh-key-openstack-edpm-ipam\") pod \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.456047 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-922gn\" (UniqueName: \"kubernetes.io/projected/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-kube-api-access-922gn\") pod \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\" (UID: \"4ae809a6-7d0a-4b85-a623-eda42d60e2d7\") " Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.463645 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-kube-api-access-922gn" (OuterVolumeSpecName: "kube-api-access-922gn") pod "4ae809a6-7d0a-4b85-a623-eda42d60e2d7" (UID: "4ae809a6-7d0a-4b85-a623-eda42d60e2d7"). InnerVolumeSpecName "kube-api-access-922gn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.510391 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4ae809a6-7d0a-4b85-a623-eda42d60e2d7" (UID: "4ae809a6-7d0a-4b85-a623-eda42d60e2d7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.514946 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-inventory" (OuterVolumeSpecName: "inventory") pod "4ae809a6-7d0a-4b85-a623-eda42d60e2d7" (UID: "4ae809a6-7d0a-4b85-a623-eda42d60e2d7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.557762 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-922gn\" (UniqueName: \"kubernetes.io/projected/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-kube-api-access-922gn\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.557802 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.557814 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ae809a6-7d0a-4b85-a623-eda42d60e2d7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.841314 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" 
event={"ID":"4ae809a6-7d0a-4b85-a623-eda42d60e2d7","Type":"ContainerDied","Data":"79722b3ceaa6fb0bc44eb6dc4700c6af5df97b84490c616b29c6b25996122d51"} Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.841347 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gl7rt" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.841351 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79722b3ceaa6fb0bc44eb6dc4700c6af5df97b84490c616b29c6b25996122d51" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.939973 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6"] Mar 20 11:10:31 crc kubenswrapper[4748]: E0320 11:10:31.940981 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae809a6-7d0a-4b85-a623-eda42d60e2d7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.941074 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae809a6-7d0a-4b85-a623-eda42d60e2d7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.942247 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae809a6-7d0a-4b85-a623-eda42d60e2d7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.943524 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.957075 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.957199 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.957472 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.957631 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:10:31 crc kubenswrapper[4748]: I0320 11:10:31.969606 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6"] Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.066777 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.066983 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.067026 4748 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krfp7\" (UniqueName: \"kubernetes.io/projected/e51cf464-1d93-4c6c-99f9-418be04dce30-kube-api-access-krfp7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.169027 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.169200 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.169246 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krfp7\" (UniqueName: \"kubernetes.io/projected/e51cf464-1d93-4c6c-99f9-418be04dce30-kube-api-access-krfp7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.173904 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.176692 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.186084 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krfp7\" (UniqueName: \"kubernetes.io/projected/e51cf464-1d93-4c6c-99f9-418be04dce30-kube-api-access-krfp7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.285604 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.790576 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6"] Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.797636 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:10:32 crc kubenswrapper[4748]: I0320 11:10:32.849550 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" event={"ID":"e51cf464-1d93-4c6c-99f9-418be04dce30","Type":"ContainerStarted","Data":"484b84606a49fb526d4ca10c07782b18037a885892a2dd897721f7867e67c287"} Mar 20 11:10:33 crc kubenswrapper[4748]: I0320 11:10:33.862081 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" event={"ID":"e51cf464-1d93-4c6c-99f9-418be04dce30","Type":"ContainerStarted","Data":"2def23da0e3d79f576b49ce62b0d650ab5973a9a78f35272b73d05f5e3067ebd"} Mar 20 11:10:33 crc kubenswrapper[4748]: I0320 11:10:33.879392 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" podStartSLOduration=2.329224243 podStartE2EDuration="2.879373139s" podCreationTimestamp="2026-03-20 11:10:31 +0000 UTC" firstStartedPulling="2026-03-20 11:10:32.797374427 +0000 UTC m=+2067.938920251" lastFinishedPulling="2026-03-20 11:10:33.347523333 +0000 UTC m=+2068.489069147" observedRunningTime="2026-03-20 11:10:33.876984199 +0000 UTC m=+2069.018530013" watchObservedRunningTime="2026-03-20 11:10:33.879373139 +0000 UTC m=+2069.020918953" Mar 20 11:10:36 crc kubenswrapper[4748]: E0320 11:10:36.180330 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13ae640_255e_402f_b0ab_a8fe649902bb.slice/crio-60f3c6a99cd05220a4d75f092a8b43ec1ac5cf90be36e8aee629ae4ab4970cb0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13ae640_255e_402f_b0ab_a8fe649902bb.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:10:42 crc kubenswrapper[4748]: I0320 11:10:42.517170 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:10:42 crc kubenswrapper[4748]: E0320 11:10:42.517926 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:10:42 crc kubenswrapper[4748]: I0320 11:10:42.973288 4748 generic.go:334] "Generic (PLEG): container finished" podID="e51cf464-1d93-4c6c-99f9-418be04dce30" containerID="2def23da0e3d79f576b49ce62b0d650ab5973a9a78f35272b73d05f5e3067ebd" exitCode=0 Mar 20 11:10:42 crc kubenswrapper[4748]: I0320 11:10:42.973389 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" event={"ID":"e51cf464-1d93-4c6c-99f9-418be04dce30","Type":"ContainerDied","Data":"2def23da0e3d79f576b49ce62b0d650ab5973a9a78f35272b73d05f5e3067ebd"} Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.381744 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.533305 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-inventory\") pod \"e51cf464-1d93-4c6c-99f9-418be04dce30\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.533385 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krfp7\" (UniqueName: \"kubernetes.io/projected/e51cf464-1d93-4c6c-99f9-418be04dce30-kube-api-access-krfp7\") pod \"e51cf464-1d93-4c6c-99f9-418be04dce30\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.533422 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-ssh-key-openstack-edpm-ipam\") pod \"e51cf464-1d93-4c6c-99f9-418be04dce30\" (UID: \"e51cf464-1d93-4c6c-99f9-418be04dce30\") " Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.561084 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51cf464-1d93-4c6c-99f9-418be04dce30-kube-api-access-krfp7" (OuterVolumeSpecName: "kube-api-access-krfp7") pod "e51cf464-1d93-4c6c-99f9-418be04dce30" (UID: "e51cf464-1d93-4c6c-99f9-418be04dce30"). InnerVolumeSpecName "kube-api-access-krfp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.601981 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-inventory" (OuterVolumeSpecName: "inventory") pod "e51cf464-1d93-4c6c-99f9-418be04dce30" (UID: "e51cf464-1d93-4c6c-99f9-418be04dce30"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.623988 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e51cf464-1d93-4c6c-99f9-418be04dce30" (UID: "e51cf464-1d93-4c6c-99f9-418be04dce30"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.637325 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.637367 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krfp7\" (UniqueName: \"kubernetes.io/projected/e51cf464-1d93-4c6c-99f9-418be04dce30-kube-api-access-krfp7\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.637385 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e51cf464-1d93-4c6c-99f9-418be04dce30-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.992123 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" event={"ID":"e51cf464-1d93-4c6c-99f9-418be04dce30","Type":"ContainerDied","Data":"484b84606a49fb526d4ca10c07782b18037a885892a2dd897721f7867e67c287"} Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 11:10:44.992163 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="484b84606a49fb526d4ca10c07782b18037a885892a2dd897721f7867e67c287" Mar 20 11:10:44 crc kubenswrapper[4748]: I0320 
11:10:44.992192 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.134270 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t"] Mar 20 11:10:45 crc kubenswrapper[4748]: E0320 11:10:45.134902 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51cf464-1d93-4c6c-99f9-418be04dce30" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.134924 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51cf464-1d93-4c6c-99f9-418be04dce30" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.135136 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51cf464-1d93-4c6c-99f9-418be04dce30" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.135823 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.138160 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.138199 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.138641 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.138954 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.139124 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.139882 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.148011 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.148664 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.156870 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t"] Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.246170 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.246284 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.246422 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.246533 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.246588 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hx8mc\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-kube-api-access-hx8mc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.246994 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.247076 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.247155 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.247247 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.247304 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.247346 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.247395 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.247446 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.247565 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.350369 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.350422 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8mc\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-kube-api-access-hx8mc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.350485 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.350525 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.350562 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.350608 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.350774 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.350809 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.350942 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.350986 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.351038 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.351107 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.351192 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.351216 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.358542 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: 
I0320 11:10:45.358671 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.359069 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.360079 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.360213 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.360765 4748 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.360947 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.362455 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.364431 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.365479 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.365852 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.367086 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.367593 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.370232 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8mc\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-kube-api-access-hx8mc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d522t\" (UID: 
\"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:45 crc kubenswrapper[4748]: I0320 11:10:45.454205 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:10:46 crc kubenswrapper[4748]: I0320 11:10:46.049191 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t"] Mar 20 11:10:46 crc kubenswrapper[4748]: E0320 11:10:46.425856 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13ae640_255e_402f_b0ab_a8fe649902bb.slice/crio-60f3c6a99cd05220a4d75f092a8b43ec1ac5cf90be36e8aee629ae4ab4970cb0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13ae640_255e_402f_b0ab_a8fe649902bb.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:10:47 crc kubenswrapper[4748]: I0320 11:10:47.015144 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" event={"ID":"5d2decbf-7d56-4ff7-896e-eaca78da7448","Type":"ContainerStarted","Data":"5f1ec888519bd0535a3be06cd086c6b95ea998190bb8b90b9febf4b7f5cae00e"} Mar 20 11:10:47 crc kubenswrapper[4748]: I0320 11:10:47.015485 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" event={"ID":"5d2decbf-7d56-4ff7-896e-eaca78da7448","Type":"ContainerStarted","Data":"70bedc2c8c7c12182097405d732c1d9a876066c4393fb77fd2911db94049c1a1"} Mar 20 11:10:47 crc kubenswrapper[4748]: I0320 11:10:47.069693 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" 
podStartSLOduration=1.638655451 podStartE2EDuration="2.069670413s" podCreationTimestamp="2026-03-20 11:10:45 +0000 UTC" firstStartedPulling="2026-03-20 11:10:46.063684804 +0000 UTC m=+2081.205230618" lastFinishedPulling="2026-03-20 11:10:46.494699766 +0000 UTC m=+2081.636245580" observedRunningTime="2026-03-20 11:10:47.068012662 +0000 UTC m=+2082.209558476" watchObservedRunningTime="2026-03-20 11:10:47.069670413 +0000 UTC m=+2082.211216227" Mar 20 11:10:56 crc kubenswrapper[4748]: I0320 11:10:56.515034 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead" Mar 20 11:10:56 crc kubenswrapper[4748]: E0320 11:10:56.691380 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13ae640_255e_402f_b0ab_a8fe649902bb.slice/crio-60f3c6a99cd05220a4d75f092a8b43ec1ac5cf90be36e8aee629ae4ab4970cb0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13ae640_255e_402f_b0ab_a8fe649902bb.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:10:57 crc kubenswrapper[4748]: I0320 11:10:57.126528 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"108bbe2c80954d52ef46b647af46c00e66773edc87b915e27f13030879de9d88"} Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.562187 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fhwrk"] Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.565967 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.582874 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fhwrk"] Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.603993 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-utilities\") pod \"redhat-operators-fhwrk\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.604142 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-catalog-content\") pod \"redhat-operators-fhwrk\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.604203 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbk77\" (UniqueName: \"kubernetes.io/projected/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-kube-api-access-xbk77\") pod \"redhat-operators-fhwrk\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.705887 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-catalog-content\") pod \"redhat-operators-fhwrk\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.705944 4748 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-xbk77\" (UniqueName: \"kubernetes.io/projected/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-kube-api-access-xbk77\") pod \"redhat-operators-fhwrk\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.706009 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-utilities\") pod \"redhat-operators-fhwrk\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.706501 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-utilities\") pod \"redhat-operators-fhwrk\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.706575 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-catalog-content\") pod \"redhat-operators-fhwrk\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.743316 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbk77\" (UniqueName: \"kubernetes.io/projected/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-kube-api-access-xbk77\") pod \"redhat-operators-fhwrk\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:12 crc kubenswrapper[4748]: I0320 11:11:12.888933 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:13 crc kubenswrapper[4748]: W0320 11:11:13.370828 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dfbb24b_0b32_49f2_80c1_9485c8b93d40.slice/crio-4c710664b0bc91600dbe5fd234adab49fd661a24f41739e5563473beebea1ba5 WatchSource:0}: Error finding container 4c710664b0bc91600dbe5fd234adab49fd661a24f41739e5563473beebea1ba5: Status 404 returned error can't find the container with id 4c710664b0bc91600dbe5fd234adab49fd661a24f41739e5563473beebea1ba5 Mar 20 11:11:13 crc kubenswrapper[4748]: I0320 11:11:13.375430 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fhwrk"] Mar 20 11:11:14 crc kubenswrapper[4748]: I0320 11:11:14.324376 4748 generic.go:334] "Generic (PLEG): container finished" podID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerID="a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90" exitCode=0 Mar 20 11:11:14 crc kubenswrapper[4748]: I0320 11:11:14.324469 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhwrk" event={"ID":"6dfbb24b-0b32-49f2-80c1-9485c8b93d40","Type":"ContainerDied","Data":"a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90"} Mar 20 11:11:14 crc kubenswrapper[4748]: I0320 11:11:14.324669 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhwrk" event={"ID":"6dfbb24b-0b32-49f2-80c1-9485c8b93d40","Type":"ContainerStarted","Data":"4c710664b0bc91600dbe5fd234adab49fd661a24f41739e5563473beebea1ba5"} Mar 20 11:11:15 crc kubenswrapper[4748]: I0320 11:11:15.335807 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhwrk" 
event={"ID":"6dfbb24b-0b32-49f2-80c1-9485c8b93d40","Type":"ContainerStarted","Data":"ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64"} Mar 20 11:11:18 crc kubenswrapper[4748]: I0320 11:11:18.375505 4748 generic.go:334] "Generic (PLEG): container finished" podID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerID="ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64" exitCode=0 Mar 20 11:11:18 crc kubenswrapper[4748]: I0320 11:11:18.375573 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhwrk" event={"ID":"6dfbb24b-0b32-49f2-80c1-9485c8b93d40","Type":"ContainerDied","Data":"ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64"} Mar 20 11:11:20 crc kubenswrapper[4748]: I0320 11:11:20.393584 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhwrk" event={"ID":"6dfbb24b-0b32-49f2-80c1-9485c8b93d40","Type":"ContainerStarted","Data":"2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c"} Mar 20 11:11:20 crc kubenswrapper[4748]: I0320 11:11:20.423103 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fhwrk" podStartSLOduration=3.528815546 podStartE2EDuration="8.423068538s" podCreationTimestamp="2026-03-20 11:11:12 +0000 UTC" firstStartedPulling="2026-03-20 11:11:14.326340037 +0000 UTC m=+2109.467885851" lastFinishedPulling="2026-03-20 11:11:19.220593029 +0000 UTC m=+2114.362138843" observedRunningTime="2026-03-20 11:11:20.414475583 +0000 UTC m=+2115.556021417" watchObservedRunningTime="2026-03-20 11:11:20.423068538 +0000 UTC m=+2115.564614352" Mar 20 11:11:22 crc kubenswrapper[4748]: I0320 11:11:22.889235 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:22 crc kubenswrapper[4748]: I0320 11:11:22.889698 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:23 crc kubenswrapper[4748]: I0320 11:11:23.939727 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fhwrk" podUID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerName="registry-server" probeResult="failure" output=< Mar 20 11:11:23 crc kubenswrapper[4748]: timeout: failed to connect service ":50051" within 1s Mar 20 11:11:23 crc kubenswrapper[4748]: > Mar 20 11:11:24 crc kubenswrapper[4748]: I0320 11:11:24.431128 4748 generic.go:334] "Generic (PLEG): container finished" podID="5d2decbf-7d56-4ff7-896e-eaca78da7448" containerID="5f1ec888519bd0535a3be06cd086c6b95ea998190bb8b90b9febf4b7f5cae00e" exitCode=0 Mar 20 11:11:24 crc kubenswrapper[4748]: I0320 11:11:24.431202 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" event={"ID":"5d2decbf-7d56-4ff7-896e-eaca78da7448","Type":"ContainerDied","Data":"5f1ec888519bd0535a3be06cd086c6b95ea998190bb8b90b9febf4b7f5cae00e"} Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.928506 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995258 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-telemetry-combined-ca-bundle\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995343 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx8mc\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-kube-api-access-hx8mc\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995433 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-ovn-default-certs-0\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995475 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995512 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-neutron-metadata-default-certs-0\") 
pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995565 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-inventory\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995625 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995654 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-bootstrap-combined-ca-bundle\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995743 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-neutron-metadata-combined-ca-bundle\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995786 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-libvirt-combined-ca-bundle\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 
20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995816 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ssh-key-openstack-edpm-ipam\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995896 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-repo-setup-combined-ca-bundle\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995920 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-nova-combined-ca-bundle\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:25 crc kubenswrapper[4748]: I0320 11:11:25.995948 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ovn-combined-ca-bundle\") pod \"5d2decbf-7d56-4ff7-896e-eaca78da7448\" (UID: \"5d2decbf-7d56-4ff7-896e-eaca78da7448\") " Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.004330 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.005028 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.005342 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.006049 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.006543 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.007098 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.007280 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-kube-api-access-hx8mc" (OuterVolumeSpecName: "kube-api-access-hx8mc") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "kube-api-access-hx8mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.009301 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.011747 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.011837 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.012516 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.013371 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.038565 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.038985 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-inventory" (OuterVolumeSpecName: "inventory") pod "5d2decbf-7d56-4ff7-896e-eaca78da7448" (UID: "5d2decbf-7d56-4ff7-896e-eaca78da7448"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098806 4748 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098884 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098897 4748 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098908 4748 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098920 4748 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098929 4748 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098938 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098946 4748 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098955 4748 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098963 4748 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098971 4748 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2decbf-7d56-4ff7-896e-eaca78da7448-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098979 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx8mc\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-kube-api-access-hx8mc\") on node \"crc\" 
DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.098989 4748 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.099001 4748 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5d2decbf-7d56-4ff7-896e-eaca78da7448-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.459729 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" event={"ID":"5d2decbf-7d56-4ff7-896e-eaca78da7448","Type":"ContainerDied","Data":"70bedc2c8c7c12182097405d732c1d9a876066c4393fb77fd2911db94049c1a1"} Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.459774 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d522t" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.459787 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70bedc2c8c7c12182097405d732c1d9a876066c4393fb77fd2911db94049c1a1" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.638103 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9"] Mar 20 11:11:26 crc kubenswrapper[4748]: E0320 11:11:26.638904 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d2decbf-7d56-4ff7-896e-eaca78da7448" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.638927 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2decbf-7d56-4ff7-896e-eaca78da7448" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.639163 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d2decbf-7d56-4ff7-896e-eaca78da7448" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.639960 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.643120 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.643183 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.643226 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.643866 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.648090 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.659609 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9"] Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.752922 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.753088 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzhpl\" (UniqueName: \"kubernetes.io/projected/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-kube-api-access-wzhpl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.753234 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.753414 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.753513 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.855910 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzhpl\" (UniqueName: \"kubernetes.io/projected/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-kube-api-access-wzhpl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.855997 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.856099 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.856166 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.856278 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.857463 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: 
\"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.862009 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.862665 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.863376 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.886279 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzhpl\" (UniqueName: \"kubernetes.io/projected/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-kube-api-access-wzhpl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6qhn9\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:26 crc kubenswrapper[4748]: I0320 11:11:26.960371 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:11:27 crc kubenswrapper[4748]: I0320 11:11:27.487772 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9"] Mar 20 11:11:28 crc kubenswrapper[4748]: I0320 11:11:28.487552 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" event={"ID":"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3","Type":"ContainerStarted","Data":"52b98cedec73d27868dd1a64257e16919b1ffea884474e5cda6cf4b8023f4c86"} Mar 20 11:11:28 crc kubenswrapper[4748]: I0320 11:11:28.488234 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" event={"ID":"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3","Type":"ContainerStarted","Data":"46b45663f2f5f7407681aa9a09d1831f54fe4237254be92c55aa7035ae92894d"} Mar 20 11:11:28 crc kubenswrapper[4748]: I0320 11:11:28.513453 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" podStartSLOduration=2.303090434 podStartE2EDuration="2.51342388s" podCreationTimestamp="2026-03-20 11:11:26 +0000 UTC" firstStartedPulling="2026-03-20 11:11:27.471814898 +0000 UTC m=+2122.613360712" lastFinishedPulling="2026-03-20 11:11:27.682148344 +0000 UTC m=+2122.823694158" observedRunningTime="2026-03-20 11:11:28.507763188 +0000 UTC m=+2123.649309002" watchObservedRunningTime="2026-03-20 11:11:28.51342388 +0000 UTC m=+2123.654969694" Mar 20 11:11:32 crc kubenswrapper[4748]: I0320 11:11:32.934381 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:32 crc kubenswrapper[4748]: I0320 11:11:32.981568 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:33 crc kubenswrapper[4748]: I0320 
11:11:33.167048 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fhwrk"] Mar 20 11:11:34 crc kubenswrapper[4748]: I0320 11:11:34.546676 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fhwrk" podUID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerName="registry-server" containerID="cri-o://2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c" gracePeriod=2 Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.102948 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.225787 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-utilities\") pod \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.225864 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-catalog-content\") pod \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.225889 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbk77\" (UniqueName: \"kubernetes.io/projected/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-kube-api-access-xbk77\") pod \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\" (UID: \"6dfbb24b-0b32-49f2-80c1-9485c8b93d40\") " Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.226829 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-utilities" (OuterVolumeSpecName: 
"utilities") pod "6dfbb24b-0b32-49f2-80c1-9485c8b93d40" (UID: "6dfbb24b-0b32-49f2-80c1-9485c8b93d40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.231697 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-kube-api-access-xbk77" (OuterVolumeSpecName: "kube-api-access-xbk77") pod "6dfbb24b-0b32-49f2-80c1-9485c8b93d40" (UID: "6dfbb24b-0b32-49f2-80c1-9485c8b93d40"). InnerVolumeSpecName "kube-api-access-xbk77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.328033 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.328303 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbk77\" (UniqueName: \"kubernetes.io/projected/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-kube-api-access-xbk77\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.358371 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dfbb24b-0b32-49f2-80c1-9485c8b93d40" (UID: "6dfbb24b-0b32-49f2-80c1-9485c8b93d40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.430123 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfbb24b-0b32-49f2-80c1-9485c8b93d40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.559906 4748 generic.go:334] "Generic (PLEG): container finished" podID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerID="2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c" exitCode=0 Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.559951 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhwrk" event={"ID":"6dfbb24b-0b32-49f2-80c1-9485c8b93d40","Type":"ContainerDied","Data":"2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c"} Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.559981 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhwrk" event={"ID":"6dfbb24b-0b32-49f2-80c1-9485c8b93d40","Type":"ContainerDied","Data":"4c710664b0bc91600dbe5fd234adab49fd661a24f41739e5563473beebea1ba5"} Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.560001 4748 scope.go:117] "RemoveContainer" containerID="2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.560069 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fhwrk" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.589007 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fhwrk"] Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.593035 4748 scope.go:117] "RemoveContainer" containerID="ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.598559 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fhwrk"] Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.622620 4748 scope.go:117] "RemoveContainer" containerID="a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.657216 4748 scope.go:117] "RemoveContainer" containerID="2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c" Mar 20 11:11:35 crc kubenswrapper[4748]: E0320 11:11:35.658050 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c\": container with ID starting with 2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c not found: ID does not exist" containerID="2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.658088 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c"} err="failed to get container status \"2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c\": rpc error: code = NotFound desc = could not find container \"2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c\": container with ID starting with 2ba1eecc5543096f6068d7bf59491cc0a5c14706cc499ddcffb72c5f293ef73c not found: ID does 
not exist" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.658134 4748 scope.go:117] "RemoveContainer" containerID="ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64" Mar 20 11:11:35 crc kubenswrapper[4748]: E0320 11:11:35.658593 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64\": container with ID starting with ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64 not found: ID does not exist" containerID="ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.658726 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64"} err="failed to get container status \"ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64\": rpc error: code = NotFound desc = could not find container \"ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64\": container with ID starting with ac150d626a827f3005293d360902f50168e152db78876bf40e5b9aa2dee94c64 not found: ID does not exist" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.658854 4748 scope.go:117] "RemoveContainer" containerID="a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90" Mar 20 11:11:35 crc kubenswrapper[4748]: E0320 11:11:35.659343 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90\": container with ID starting with a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90 not found: ID does not exist" containerID="a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90" Mar 20 11:11:35 crc kubenswrapper[4748]: I0320 11:11:35.659381 4748 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90"} err="failed to get container status \"a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90\": rpc error: code = NotFound desc = could not find container \"a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90\": container with ID starting with a76570ce9c9e13fe9e1185d442b07d84d22dc5dc835d1704cc6a0f0da1f9dd90 not found: ID does not exist" Mar 20 11:11:37 crc kubenswrapper[4748]: I0320 11:11:37.525749 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" path="/var/lib/kubelet/pods/6dfbb24b-0b32-49f2-80c1-9485c8b93d40/volumes" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.156680 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566752-9prbv"] Mar 20 11:12:00 crc kubenswrapper[4748]: E0320 11:12:00.157804 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.157823 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4748]: E0320 11:12:00.157863 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.157871 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4748]: E0320 11:12:00.157894 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.157902 4748 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.158126 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfbb24b-0b32-49f2-80c1-9485c8b93d40" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.158884 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-9prbv" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.162231 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.162561 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.165960 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.175023 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-9prbv"] Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.350880 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5h6n\" (UniqueName: \"kubernetes.io/projected/da6dab72-7622-4f0e-bb06-c33bacf10588-kube-api-access-b5h6n\") pod \"auto-csr-approver-29566752-9prbv\" (UID: \"da6dab72-7622-4f0e-bb06-c33bacf10588\") " pod="openshift-infra/auto-csr-approver-29566752-9prbv" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.452762 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5h6n\" (UniqueName: \"kubernetes.io/projected/da6dab72-7622-4f0e-bb06-c33bacf10588-kube-api-access-b5h6n\") pod \"auto-csr-approver-29566752-9prbv\" (UID: 
\"da6dab72-7622-4f0e-bb06-c33bacf10588\") " pod="openshift-infra/auto-csr-approver-29566752-9prbv" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.479061 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5h6n\" (UniqueName: \"kubernetes.io/projected/da6dab72-7622-4f0e-bb06-c33bacf10588-kube-api-access-b5h6n\") pod \"auto-csr-approver-29566752-9prbv\" (UID: \"da6dab72-7622-4f0e-bb06-c33bacf10588\") " pod="openshift-infra/auto-csr-approver-29566752-9prbv" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.482544 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-9prbv" Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.940476 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-9prbv"] Mar 20 11:12:00 crc kubenswrapper[4748]: I0320 11:12:00.987122 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-9prbv" event={"ID":"da6dab72-7622-4f0e-bb06-c33bacf10588","Type":"ContainerStarted","Data":"ccf5099f803aa8bf017630bbcd0b33bb7ef3a09844dbbc8cd7a9a50e059475f7"} Mar 20 11:12:03 crc kubenswrapper[4748]: I0320 11:12:03.012662 4748 generic.go:334] "Generic (PLEG): container finished" podID="da6dab72-7622-4f0e-bb06-c33bacf10588" containerID="d4ff2a6951846e8772811a165a7ade144c87f046dd3deeb17aba0c54539f067a" exitCode=0 Mar 20 11:12:03 crc kubenswrapper[4748]: I0320 11:12:03.012714 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-9prbv" event={"ID":"da6dab72-7622-4f0e-bb06-c33bacf10588","Type":"ContainerDied","Data":"d4ff2a6951846e8772811a165a7ade144c87f046dd3deeb17aba0c54539f067a"} Mar 20 11:12:04 crc kubenswrapper[4748]: I0320 11:12:04.354763 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-9prbv" Mar 20 11:12:04 crc kubenswrapper[4748]: I0320 11:12:04.534146 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5h6n\" (UniqueName: \"kubernetes.io/projected/da6dab72-7622-4f0e-bb06-c33bacf10588-kube-api-access-b5h6n\") pod \"da6dab72-7622-4f0e-bb06-c33bacf10588\" (UID: \"da6dab72-7622-4f0e-bb06-c33bacf10588\") " Mar 20 11:12:04 crc kubenswrapper[4748]: I0320 11:12:04.541980 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da6dab72-7622-4f0e-bb06-c33bacf10588-kube-api-access-b5h6n" (OuterVolumeSpecName: "kube-api-access-b5h6n") pod "da6dab72-7622-4f0e-bb06-c33bacf10588" (UID: "da6dab72-7622-4f0e-bb06-c33bacf10588"). InnerVolumeSpecName "kube-api-access-b5h6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:12:04 crc kubenswrapper[4748]: I0320 11:12:04.637922 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5h6n\" (UniqueName: \"kubernetes.io/projected/da6dab72-7622-4f0e-bb06-c33bacf10588-kube-api-access-b5h6n\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:05 crc kubenswrapper[4748]: I0320 11:12:05.029822 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-9prbv" event={"ID":"da6dab72-7622-4f0e-bb06-c33bacf10588","Type":"ContainerDied","Data":"ccf5099f803aa8bf017630bbcd0b33bb7ef3a09844dbbc8cd7a9a50e059475f7"} Mar 20 11:12:05 crc kubenswrapper[4748]: I0320 11:12:05.029902 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccf5099f803aa8bf017630bbcd0b33bb7ef3a09844dbbc8cd7a9a50e059475f7" Mar 20 11:12:05 crc kubenswrapper[4748]: I0320 11:12:05.029980 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-9prbv" Mar 20 11:12:05 crc kubenswrapper[4748]: I0320 11:12:05.419986 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-q8tck"] Mar 20 11:12:05 crc kubenswrapper[4748]: I0320 11:12:05.428423 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-q8tck"] Mar 20 11:12:05 crc kubenswrapper[4748]: I0320 11:12:05.527766 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08153fb7-b276-48d4-a0ad-c4a433f5db7f" path="/var/lib/kubelet/pods/08153fb7-b276-48d4-a0ad-c4a433f5db7f/volumes" Mar 20 11:12:20 crc kubenswrapper[4748]: I0320 11:12:20.397757 4748 scope.go:117] "RemoveContainer" containerID="b77affdc552ff46fa440829ec7baf141492a8c3ef9601b449c647e1ecc6637e3" Mar 20 11:12:32 crc kubenswrapper[4748]: I0320 11:12:32.278109 4748 generic.go:334] "Generic (PLEG): container finished" podID="17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3" containerID="52b98cedec73d27868dd1a64257e16919b1ffea884474e5cda6cf4b8023f4c86" exitCode=0 Mar 20 11:12:32 crc kubenswrapper[4748]: I0320 11:12:32.278256 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" event={"ID":"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3","Type":"ContainerDied","Data":"52b98cedec73d27868dd1a64257e16919b1ffea884474e5cda6cf4b8023f4c86"} Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.713333 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.801754 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovn-combined-ca-bundle\") pod \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.801853 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-inventory\") pod \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.801878 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovncontroller-config-0\") pod \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.801921 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzhpl\" (UniqueName: \"kubernetes.io/projected/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-kube-api-access-wzhpl\") pod \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.802024 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ssh-key-openstack-edpm-ipam\") pod \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\" (UID: \"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3\") " Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.808684 4748 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-kube-api-access-wzhpl" (OuterVolumeSpecName: "kube-api-access-wzhpl") pod "17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3" (UID: "17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3"). InnerVolumeSpecName "kube-api-access-wzhpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.809120 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3" (UID: "17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.832122 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3" (UID: "17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.832292 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-inventory" (OuterVolumeSpecName: "inventory") pod "17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3" (UID: "17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.832729 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3" (UID: "17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.904244 4748 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.904283 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.904294 4748 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.904302 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzhpl\" (UniqueName: \"kubernetes.io/projected/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-kube-api-access-wzhpl\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:33 crc kubenswrapper[4748]: I0320 11:12:33.904311 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.297212 4748 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" event={"ID":"17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3","Type":"ContainerDied","Data":"46b45663f2f5f7407681aa9a09d1831f54fe4237254be92c55aa7035ae92894d"} Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.297280 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b45663f2f5f7407681aa9a09d1831f54fe4237254be92c55aa7035ae92894d" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.297237 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6qhn9" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.402041 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8"] Mar 20 11:12:34 crc kubenswrapper[4748]: E0320 11:12:34.428331 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.428364 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 11:12:34 crc kubenswrapper[4748]: E0320 11:12:34.428378 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da6dab72-7622-4f0e-bb06-c33bacf10588" containerName="oc" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.428385 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6dab72-7622-4f0e-bb06-c33bacf10588" containerName="oc" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.428987 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.429013 4748 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="da6dab72-7622-4f0e-bb06-c33bacf10588" containerName="oc" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.429911 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.432973 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.433179 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.433971 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.434116 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.434513 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.435846 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.443695 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8"] Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.519120 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") 
" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.519462 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.519511 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.519555 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.519661 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvkpj\" (UniqueName: \"kubernetes.io/projected/cba4401d-824d-4c51-8a04-43691fa34a45-kube-api-access-gvkpj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.519718 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.621138 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvkpj\" (UniqueName: \"kubernetes.io/projected/cba4401d-824d-4c51-8a04-43691fa34a45-kube-api-access-gvkpj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.621207 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.621335 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 
11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.621509 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.621591 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.621676 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.628610 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.628834 4748 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.628898 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.628880 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.630310 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.641788 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvkpj\" (UniqueName: 
\"kubernetes.io/projected/cba4401d-824d-4c51-8a04-43691fa34a45-kube-api-access-gvkpj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:34 crc kubenswrapper[4748]: I0320 11:12:34.791522 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:12:35 crc kubenswrapper[4748]: I0320 11:12:35.349005 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8"] Mar 20 11:12:36 crc kubenswrapper[4748]: I0320 11:12:36.313822 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" event={"ID":"cba4401d-824d-4c51-8a04-43691fa34a45","Type":"ContainerStarted","Data":"182c830688126db72e6d73631ab69a8a3f1bf2aef88ad552891decf9f011c61d"} Mar 20 11:12:36 crc kubenswrapper[4748]: I0320 11:12:36.313886 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" event={"ID":"cba4401d-824d-4c51-8a04-43691fa34a45","Type":"ContainerStarted","Data":"070651285ac725b23f09307f1a986bd4f78c9bbc9652fbf6b89e8885d0e1646a"} Mar 20 11:12:36 crc kubenswrapper[4748]: I0320 11:12:36.337820 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" podStartSLOduration=2.14233635 podStartE2EDuration="2.337804255s" podCreationTimestamp="2026-03-20 11:12:34 +0000 UTC" firstStartedPulling="2026-03-20 11:12:35.349686272 +0000 UTC m=+2190.491232086" lastFinishedPulling="2026-03-20 11:12:35.545154177 +0000 UTC m=+2190.686699991" observedRunningTime="2026-03-20 11:12:36.336201035 +0000 UTC m=+2191.477746849" watchObservedRunningTime="2026-03-20 
11:12:36.337804255 +0000 UTC m=+2191.479350069" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.661696 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t8rfc"] Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.664455 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.676315 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8rfc"] Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.779910 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkqzb\" (UniqueName: \"kubernetes.io/projected/e227eea4-e616-4e1f-8e09-20efb2b3d718-kube-api-access-qkqzb\") pod \"redhat-marketplace-t8rfc\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.780065 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-catalog-content\") pod \"redhat-marketplace-t8rfc\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.780301 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-utilities\") pod \"redhat-marketplace-t8rfc\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.882073 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkqzb\" 
(UniqueName: \"kubernetes.io/projected/e227eea4-e616-4e1f-8e09-20efb2b3d718-kube-api-access-qkqzb\") pod \"redhat-marketplace-t8rfc\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.882190 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-catalog-content\") pod \"redhat-marketplace-t8rfc\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.882264 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-utilities\") pod \"redhat-marketplace-t8rfc\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.882858 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-utilities\") pod \"redhat-marketplace-t8rfc\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.883500 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-catalog-content\") pod \"redhat-marketplace-t8rfc\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.906070 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkqzb\" (UniqueName: 
\"kubernetes.io/projected/e227eea4-e616-4e1f-8e09-20efb2b3d718-kube-api-access-qkqzb\") pod \"redhat-marketplace-t8rfc\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:42 crc kubenswrapper[4748]: I0320 11:12:42.986812 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:43 crc kubenswrapper[4748]: I0320 11:12:43.492951 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8rfc"] Mar 20 11:12:44 crc kubenswrapper[4748]: I0320 11:12:44.386340 4748 generic.go:334] "Generic (PLEG): container finished" podID="e227eea4-e616-4e1f-8e09-20efb2b3d718" containerID="ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285" exitCode=0 Mar 20 11:12:44 crc kubenswrapper[4748]: I0320 11:12:44.386397 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8rfc" event={"ID":"e227eea4-e616-4e1f-8e09-20efb2b3d718","Type":"ContainerDied","Data":"ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285"} Mar 20 11:12:44 crc kubenswrapper[4748]: I0320 11:12:44.386660 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8rfc" event={"ID":"e227eea4-e616-4e1f-8e09-20efb2b3d718","Type":"ContainerStarted","Data":"ae9c212631f30f9ab5181854efa6a8f0e56486c724391223943c191a26331d2e"} Mar 20 11:12:45 crc kubenswrapper[4748]: I0320 11:12:45.396328 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8rfc" event={"ID":"e227eea4-e616-4e1f-8e09-20efb2b3d718","Type":"ContainerStarted","Data":"2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be"} Mar 20 11:12:46 crc kubenswrapper[4748]: I0320 11:12:46.409580 4748 generic.go:334] "Generic (PLEG): container finished" podID="e227eea4-e616-4e1f-8e09-20efb2b3d718" 
containerID="2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be" exitCode=0 Mar 20 11:12:46 crc kubenswrapper[4748]: I0320 11:12:46.409637 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8rfc" event={"ID":"e227eea4-e616-4e1f-8e09-20efb2b3d718","Type":"ContainerDied","Data":"2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be"} Mar 20 11:12:47 crc kubenswrapper[4748]: I0320 11:12:47.420093 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8rfc" event={"ID":"e227eea4-e616-4e1f-8e09-20efb2b3d718","Type":"ContainerStarted","Data":"938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5"} Mar 20 11:12:47 crc kubenswrapper[4748]: I0320 11:12:47.441258 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t8rfc" podStartSLOduration=3.018181451 podStartE2EDuration="5.441239744s" podCreationTimestamp="2026-03-20 11:12:42 +0000 UTC" firstStartedPulling="2026-03-20 11:12:44.388879403 +0000 UTC m=+2199.530425217" lastFinishedPulling="2026-03-20 11:12:46.811937696 +0000 UTC m=+2201.953483510" observedRunningTime="2026-03-20 11:12:47.434734531 +0000 UTC m=+2202.576280365" watchObservedRunningTime="2026-03-20 11:12:47.441239744 +0000 UTC m=+2202.582785558" Mar 20 11:12:52 crc kubenswrapper[4748]: I0320 11:12:52.987668 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:52 crc kubenswrapper[4748]: I0320 11:12:52.988286 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:53 crc kubenswrapper[4748]: I0320 11:12:53.034872 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:53 crc kubenswrapper[4748]: I0320 11:12:53.535878 
4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:54 crc kubenswrapper[4748]: I0320 11:12:54.245520 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8rfc"] Mar 20 11:12:55 crc kubenswrapper[4748]: I0320 11:12:55.497617 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t8rfc" podUID="e227eea4-e616-4e1f-8e09-20efb2b3d718" containerName="registry-server" containerID="cri-o://938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5" gracePeriod=2 Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.500272 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.513914 4748 generic.go:334] "Generic (PLEG): container finished" podID="e227eea4-e616-4e1f-8e09-20efb2b3d718" containerID="938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5" exitCode=0 Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.513970 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8rfc" event={"ID":"e227eea4-e616-4e1f-8e09-20efb2b3d718","Type":"ContainerDied","Data":"938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5"} Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.514015 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8rfc" event={"ID":"e227eea4-e616-4e1f-8e09-20efb2b3d718","Type":"ContainerDied","Data":"ae9c212631f30f9ab5181854efa6a8f0e56486c724391223943c191a26331d2e"} Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.514025 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8rfc" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.514039 4748 scope.go:117] "RemoveContainer" containerID="938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.540407 4748 scope.go:117] "RemoveContainer" containerID="2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.572271 4748 scope.go:117] "RemoveContainer" containerID="ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.616287 4748 scope.go:117] "RemoveContainer" containerID="938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5" Mar 20 11:12:56 crc kubenswrapper[4748]: E0320 11:12:56.616819 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5\": container with ID starting with 938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5 not found: ID does not exist" containerID="938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.616873 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5"} err="failed to get container status \"938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5\": rpc error: code = NotFound desc = could not find container \"938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5\": container with ID starting with 938ba1e399443ecfe2391b5726e7243c02db965bb4b3f1ee9ed6817434637be5 not found: ID does not exist" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.616904 4748 scope.go:117] "RemoveContainer" 
containerID="2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be" Mar 20 11:12:56 crc kubenswrapper[4748]: E0320 11:12:56.617209 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be\": container with ID starting with 2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be not found: ID does not exist" containerID="2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.617249 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be"} err="failed to get container status \"2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be\": rpc error: code = NotFound desc = could not find container \"2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be\": container with ID starting with 2b8854d6925f1ae158821a41af042fefae07354780dbedcc95d4179ffa34d2be not found: ID does not exist" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.617268 4748 scope.go:117] "RemoveContainer" containerID="ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285" Mar 20 11:12:56 crc kubenswrapper[4748]: E0320 11:12:56.620042 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285\": container with ID starting with ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285 not found: ID does not exist" containerID="ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.620122 4748 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285"} err="failed to get container status \"ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285\": rpc error: code = NotFound desc = could not find container \"ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285\": container with ID starting with ca1bad5bfcfd7951defb8070fba2434f90ffb70b55352972abe3bb9320c5f285 not found: ID does not exist" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.655696 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkqzb\" (UniqueName: \"kubernetes.io/projected/e227eea4-e616-4e1f-8e09-20efb2b3d718-kube-api-access-qkqzb\") pod \"e227eea4-e616-4e1f-8e09-20efb2b3d718\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.655884 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-catalog-content\") pod \"e227eea4-e616-4e1f-8e09-20efb2b3d718\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.655922 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-utilities\") pod \"e227eea4-e616-4e1f-8e09-20efb2b3d718\" (UID: \"e227eea4-e616-4e1f-8e09-20efb2b3d718\") " Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.656956 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-utilities" (OuterVolumeSpecName: "utilities") pod "e227eea4-e616-4e1f-8e09-20efb2b3d718" (UID: "e227eea4-e616-4e1f-8e09-20efb2b3d718"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.663258 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e227eea4-e616-4e1f-8e09-20efb2b3d718-kube-api-access-qkqzb" (OuterVolumeSpecName: "kube-api-access-qkqzb") pod "e227eea4-e616-4e1f-8e09-20efb2b3d718" (UID: "e227eea4-e616-4e1f-8e09-20efb2b3d718"). InnerVolumeSpecName "kube-api-access-qkqzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.757902 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkqzb\" (UniqueName: \"kubernetes.io/projected/e227eea4-e616-4e1f-8e09-20efb2b3d718-kube-api-access-qkqzb\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:56 crc kubenswrapper[4748]: I0320 11:12:56.758189 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:57 crc kubenswrapper[4748]: I0320 11:12:57.082056 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e227eea4-e616-4e1f-8e09-20efb2b3d718" (UID: "e227eea4-e616-4e1f-8e09-20efb2b3d718"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:12:57 crc kubenswrapper[4748]: I0320 11:12:57.151491 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8rfc"] Mar 20 11:12:57 crc kubenswrapper[4748]: I0320 11:12:57.159987 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8rfc"] Mar 20 11:12:57 crc kubenswrapper[4748]: I0320 11:12:57.166032 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e227eea4-e616-4e1f-8e09-20efb2b3d718-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:57 crc kubenswrapper[4748]: I0320 11:12:57.532955 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e227eea4-e616-4e1f-8e09-20efb2b3d718" path="/var/lib/kubelet/pods/e227eea4-e616-4e1f-8e09-20efb2b3d718/volumes" Mar 20 11:13:12 crc kubenswrapper[4748]: I0320 11:13:12.928449 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:13:12 crc kubenswrapper[4748]: I0320 11:13:12.928911 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.583732 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nnlbm"] Mar 20 11:13:22 crc kubenswrapper[4748]: E0320 11:13:22.584792 4748 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e227eea4-e616-4e1f-8e09-20efb2b3d718" containerName="extract-utilities" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.584808 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e227eea4-e616-4e1f-8e09-20efb2b3d718" containerName="extract-utilities" Mar 20 11:13:22 crc kubenswrapper[4748]: E0320 11:13:22.584824 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e227eea4-e616-4e1f-8e09-20efb2b3d718" containerName="extract-content" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.584863 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e227eea4-e616-4e1f-8e09-20efb2b3d718" containerName="extract-content" Mar 20 11:13:22 crc kubenswrapper[4748]: E0320 11:13:22.584873 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e227eea4-e616-4e1f-8e09-20efb2b3d718" containerName="registry-server" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.584879 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e227eea4-e616-4e1f-8e09-20efb2b3d718" containerName="registry-server" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.585079 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="e227eea4-e616-4e1f-8e09-20efb2b3d718" containerName="registry-server" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.586480 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.602552 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnlbm"] Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.687952 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-catalog-content\") pod \"certified-operators-nnlbm\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.688060 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q89g6\" (UniqueName: \"kubernetes.io/projected/30fd2d6d-270b-422a-b512-d7ca8b032886-kube-api-access-q89g6\") pod \"certified-operators-nnlbm\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.688097 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-utilities\") pod \"certified-operators-nnlbm\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.789358 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q89g6\" (UniqueName: \"kubernetes.io/projected/30fd2d6d-270b-422a-b512-d7ca8b032886-kube-api-access-q89g6\") pod \"certified-operators-nnlbm\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.789450 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-utilities\") pod \"certified-operators-nnlbm\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.789573 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-catalog-content\") pod \"certified-operators-nnlbm\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.790075 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-utilities\") pod \"certified-operators-nnlbm\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.790170 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-catalog-content\") pod \"certified-operators-nnlbm\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.808821 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q89g6\" (UniqueName: \"kubernetes.io/projected/30fd2d6d-270b-422a-b512-d7ca8b032886-kube-api-access-q89g6\") pod \"certified-operators-nnlbm\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:22 crc kubenswrapper[4748]: I0320 11:13:22.928912 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:23 crc kubenswrapper[4748]: I0320 11:13:23.497019 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnlbm"] Mar 20 11:13:23 crc kubenswrapper[4748]: I0320 11:13:23.844894 4748 generic.go:334] "Generic (PLEG): container finished" podID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerID="e6271ae8a250b7f6be1bf6f53f04d1f8525072b9bc58deb1a1a74e401f301fbb" exitCode=0 Mar 20 11:13:23 crc kubenswrapper[4748]: I0320 11:13:23.845229 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnlbm" event={"ID":"30fd2d6d-270b-422a-b512-d7ca8b032886","Type":"ContainerDied","Data":"e6271ae8a250b7f6be1bf6f53f04d1f8525072b9bc58deb1a1a74e401f301fbb"} Mar 20 11:13:23 crc kubenswrapper[4748]: I0320 11:13:23.845262 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnlbm" event={"ID":"30fd2d6d-270b-422a-b512-d7ca8b032886","Type":"ContainerStarted","Data":"835fa17e2f3d96350dd14f8fd79a2e33186e0212ab067901443c7080c6862f93"} Mar 20 11:13:24 crc kubenswrapper[4748]: I0320 11:13:24.854620 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnlbm" event={"ID":"30fd2d6d-270b-422a-b512-d7ca8b032886","Type":"ContainerStarted","Data":"f6e2e8457ea5cd4eed85ac6405c07b4f7d7f3e1e62940f66e030b3ca1ac2b060"} Mar 20 11:13:24 crc kubenswrapper[4748]: I0320 11:13:24.857179 4748 generic.go:334] "Generic (PLEG): container finished" podID="cba4401d-824d-4c51-8a04-43691fa34a45" containerID="182c830688126db72e6d73631ab69a8a3f1bf2aef88ad552891decf9f011c61d" exitCode=0 Mar 20 11:13:24 crc kubenswrapper[4748]: I0320 11:13:24.857224 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" 
event={"ID":"cba4401d-824d-4c51-8a04-43691fa34a45","Type":"ContainerDied","Data":"182c830688126db72e6d73631ab69a8a3f1bf2aef88ad552891decf9f011c61d"} Mar 20 11:13:25 crc kubenswrapper[4748]: I0320 11:13:25.868179 4748 generic.go:334] "Generic (PLEG): container finished" podID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerID="f6e2e8457ea5cd4eed85ac6405c07b4f7d7f3e1e62940f66e030b3ca1ac2b060" exitCode=0 Mar 20 11:13:25 crc kubenswrapper[4748]: I0320 11:13:25.868261 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnlbm" event={"ID":"30fd2d6d-270b-422a-b512-d7ca8b032886","Type":"ContainerDied","Data":"f6e2e8457ea5cd4eed85ac6405c07b4f7d7f3e1e62940f66e030b3ca1ac2b060"} Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.328202 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.359539 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-ssh-key-openstack-edpm-ipam\") pod \"cba4401d-824d-4c51-8a04-43691fa34a45\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.359631 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-nova-metadata-neutron-config-0\") pod \"cba4401d-824d-4c51-8a04-43691fa34a45\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.359724 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-ovn-metadata-agent-neutron-config-0\") pod \"cba4401d-824d-4c51-8a04-43691fa34a45\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.359798 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-inventory\") pod \"cba4401d-824d-4c51-8a04-43691fa34a45\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.359891 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-metadata-combined-ca-bundle\") pod \"cba4401d-824d-4c51-8a04-43691fa34a45\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.359945 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvkpj\" (UniqueName: \"kubernetes.io/projected/cba4401d-824d-4c51-8a04-43691fa34a45-kube-api-access-gvkpj\") pod \"cba4401d-824d-4c51-8a04-43691fa34a45\" (UID: \"cba4401d-824d-4c51-8a04-43691fa34a45\") " Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.368287 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cba4401d-824d-4c51-8a04-43691fa34a45" (UID: "cba4401d-824d-4c51-8a04-43691fa34a45"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.378150 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba4401d-824d-4c51-8a04-43691fa34a45-kube-api-access-gvkpj" (OuterVolumeSpecName: "kube-api-access-gvkpj") pod "cba4401d-824d-4c51-8a04-43691fa34a45" (UID: "cba4401d-824d-4c51-8a04-43691fa34a45"). InnerVolumeSpecName "kube-api-access-gvkpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.390803 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-inventory" (OuterVolumeSpecName: "inventory") pod "cba4401d-824d-4c51-8a04-43691fa34a45" (UID: "cba4401d-824d-4c51-8a04-43691fa34a45"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.396361 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "cba4401d-824d-4c51-8a04-43691fa34a45" (UID: "cba4401d-824d-4c51-8a04-43691fa34a45"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.397601 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "cba4401d-824d-4c51-8a04-43691fa34a45" (UID: "cba4401d-824d-4c51-8a04-43691fa34a45"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.407965 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cba4401d-824d-4c51-8a04-43691fa34a45" (UID: "cba4401d-824d-4c51-8a04-43691fa34a45"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.461495 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.461534 4748 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.461547 4748 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.461562 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.461571 4748 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba4401d-824d-4c51-8a04-43691fa34a45-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.461584 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvkpj\" (UniqueName: \"kubernetes.io/projected/cba4401d-824d-4c51-8a04-43691fa34a45-kube-api-access-gvkpj\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.880322 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" event={"ID":"cba4401d-824d-4c51-8a04-43691fa34a45","Type":"ContainerDied","Data":"070651285ac725b23f09307f1a986bd4f78c9bbc9652fbf6b89e8885d0e1646a"} Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.880366 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="070651285ac725b23f09307f1a986bd4f78c9bbc9652fbf6b89e8885d0e1646a" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.881055 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.997133 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4"] Mar 20 11:13:26 crc kubenswrapper[4748]: E0320 11:13:26.997658 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba4401d-824d-4c51-8a04-43691fa34a45" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.997683 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba4401d-824d-4c51-8a04-43691fa34a45" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 11:13:26 crc kubenswrapper[4748]: I0320 11:13:26.998072 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba4401d-824d-4c51-8a04-43691fa34a45" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 11:13:26 crc 
kubenswrapper[4748]: I0320 11:13:26.998891 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.001514 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.001586 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.001787 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.001947 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.002019 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.007184 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4"] Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.080148 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b6tr\" (UniqueName: \"kubernetes.io/projected/a1734ea2-369d-4c96-aca3-1a450a82e9dc-kube-api-access-2b6tr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.080330 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.080384 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.080422 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.080456 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.182999 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 
crc kubenswrapper[4748]: I0320 11:13:27.183112 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.183172 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.183235 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.183287 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b6tr\" (UniqueName: \"kubernetes.io/projected/a1734ea2-369d-4c96-aca3-1a450a82e9dc-kube-api-access-2b6tr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.188384 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.188989 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.191931 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.192720 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.205417 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b6tr\" (UniqueName: \"kubernetes.io/projected/a1734ea2-369d-4c96-aca3-1a450a82e9dc-kube-api-access-2b6tr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc 
kubenswrapper[4748]: I0320 11:13:27.317789 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:13:27 crc kubenswrapper[4748]: W0320 11:13:27.802036 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1734ea2_369d_4c96_aca3_1a450a82e9dc.slice/crio-c6471593d21744153b91f00bc67341173fb979186036d22618c3eed0c90ba4be WatchSource:0}: Error finding container c6471593d21744153b91f00bc67341173fb979186036d22618c3eed0c90ba4be: Status 404 returned error can't find the container with id c6471593d21744153b91f00bc67341173fb979186036d22618c3eed0c90ba4be Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.806683 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4"] Mar 20 11:13:27 crc kubenswrapper[4748]: I0320 11:13:27.890113 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" event={"ID":"a1734ea2-369d-4c96-aca3-1a450a82e9dc","Type":"ContainerStarted","Data":"c6471593d21744153b91f00bc67341173fb979186036d22618c3eed0c90ba4be"} Mar 20 11:13:28 crc kubenswrapper[4748]: I0320 11:13:28.904640 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" event={"ID":"a1734ea2-369d-4c96-aca3-1a450a82e9dc","Type":"ContainerStarted","Data":"ea3f2a156242454a0399dfbf9e3ce7cc44aea6e8255d0e4a48d05d8262e0e37a"} Mar 20 11:13:28 crc kubenswrapper[4748]: I0320 11:13:28.907279 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnlbm" event={"ID":"30fd2d6d-270b-422a-b512-d7ca8b032886","Type":"ContainerStarted","Data":"866f991745ee6a71588a0f2075f37b4a0d1e4c13c379bf12f4021a4c4d4c2efd"} Mar 20 11:13:28 crc kubenswrapper[4748]: I0320 11:13:28.927862 4748 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" podStartSLOduration=2.756811452 podStartE2EDuration="2.927810654s" podCreationTimestamp="2026-03-20 11:13:26 +0000 UTC" firstStartedPulling="2026-03-20 11:13:27.812115377 +0000 UTC m=+2242.953661191" lastFinishedPulling="2026-03-20 11:13:27.983114579 +0000 UTC m=+2243.124660393" observedRunningTime="2026-03-20 11:13:28.922991263 +0000 UTC m=+2244.064537087" watchObservedRunningTime="2026-03-20 11:13:28.927810654 +0000 UTC m=+2244.069356468" Mar 20 11:13:32 crc kubenswrapper[4748]: I0320 11:13:32.929522 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:32 crc kubenswrapper[4748]: I0320 11:13:32.931016 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:32 crc kubenswrapper[4748]: I0320 11:13:32.982140 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:33 crc kubenswrapper[4748]: I0320 11:13:33.005889 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nnlbm" podStartSLOduration=6.389066352 podStartE2EDuration="11.005867597s" podCreationTimestamp="2026-03-20 11:13:22 +0000 UTC" firstStartedPulling="2026-03-20 11:13:23.847067502 +0000 UTC m=+2238.988613316" lastFinishedPulling="2026-03-20 11:13:28.463868747 +0000 UTC m=+2243.605414561" observedRunningTime="2026-03-20 11:13:28.942521572 +0000 UTC m=+2244.084067396" watchObservedRunningTime="2026-03-20 11:13:33.005867597 +0000 UTC m=+2248.147413411" Mar 20 11:13:33 crc kubenswrapper[4748]: I0320 11:13:33.990915 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:34 crc 
kubenswrapper[4748]: I0320 11:13:34.031914 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnlbm"] Mar 20 11:13:35 crc kubenswrapper[4748]: I0320 11:13:35.975879 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nnlbm" podUID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerName="registry-server" containerID="cri-o://866f991745ee6a71588a0f2075f37b4a0d1e4c13c379bf12f4021a4c4d4c2efd" gracePeriod=2 Mar 20 11:13:36 crc kubenswrapper[4748]: I0320 11:13:36.987960 4748 generic.go:334] "Generic (PLEG): container finished" podID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerID="866f991745ee6a71588a0f2075f37b4a0d1e4c13c379bf12f4021a4c4d4c2efd" exitCode=0 Mar 20 11:13:36 crc kubenswrapper[4748]: I0320 11:13:36.988061 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnlbm" event={"ID":"30fd2d6d-270b-422a-b512-d7ca8b032886","Type":"ContainerDied","Data":"866f991745ee6a71588a0f2075f37b4a0d1e4c13c379bf12f4021a4c4d4c2efd"} Mar 20 11:13:36 crc kubenswrapper[4748]: I0320 11:13:36.988393 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnlbm" event={"ID":"30fd2d6d-270b-422a-b512-d7ca8b032886","Type":"ContainerDied","Data":"835fa17e2f3d96350dd14f8fd79a2e33186e0212ab067901443c7080c6862f93"} Mar 20 11:13:36 crc kubenswrapper[4748]: I0320 11:13:36.988418 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="835fa17e2f3d96350dd14f8fd79a2e33186e0212ab067901443c7080c6862f93" Mar 20 11:13:36 crc kubenswrapper[4748]: I0320 11:13:36.992495 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:37 crc kubenswrapper[4748]: I0320 11:13:37.158088 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q89g6\" (UniqueName: \"kubernetes.io/projected/30fd2d6d-270b-422a-b512-d7ca8b032886-kube-api-access-q89g6\") pod \"30fd2d6d-270b-422a-b512-d7ca8b032886\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " Mar 20 11:13:37 crc kubenswrapper[4748]: I0320 11:13:37.158850 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-utilities\") pod \"30fd2d6d-270b-422a-b512-d7ca8b032886\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " Mar 20 11:13:37 crc kubenswrapper[4748]: I0320 11:13:37.159095 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-catalog-content\") pod \"30fd2d6d-270b-422a-b512-d7ca8b032886\" (UID: \"30fd2d6d-270b-422a-b512-d7ca8b032886\") " Mar 20 11:13:37 crc kubenswrapper[4748]: I0320 11:13:37.159958 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-utilities" (OuterVolumeSpecName: "utilities") pod "30fd2d6d-270b-422a-b512-d7ca8b032886" (UID: "30fd2d6d-270b-422a-b512-d7ca8b032886"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:13:37 crc kubenswrapper[4748]: I0320 11:13:37.179179 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30fd2d6d-270b-422a-b512-d7ca8b032886-kube-api-access-q89g6" (OuterVolumeSpecName: "kube-api-access-q89g6") pod "30fd2d6d-270b-422a-b512-d7ca8b032886" (UID: "30fd2d6d-270b-422a-b512-d7ca8b032886"). InnerVolumeSpecName "kube-api-access-q89g6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:13:37 crc kubenswrapper[4748]: I0320 11:13:37.212750 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30fd2d6d-270b-422a-b512-d7ca8b032886" (UID: "30fd2d6d-270b-422a-b512-d7ca8b032886"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:13:37 crc kubenswrapper[4748]: I0320 11:13:37.261745 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:37 crc kubenswrapper[4748]: I0320 11:13:37.261784 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q89g6\" (UniqueName: \"kubernetes.io/projected/30fd2d6d-270b-422a-b512-d7ca8b032886-kube-api-access-q89g6\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:37 crc kubenswrapper[4748]: I0320 11:13:37.261798 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30fd2d6d-270b-422a-b512-d7ca8b032886-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:37 crc kubenswrapper[4748]: I0320 11:13:37.998126 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnlbm" Mar 20 11:13:38 crc kubenswrapper[4748]: I0320 11:13:38.030625 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnlbm"] Mar 20 11:13:38 crc kubenswrapper[4748]: I0320 11:13:38.041163 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nnlbm"] Mar 20 11:13:39 crc kubenswrapper[4748]: I0320 11:13:39.530765 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30fd2d6d-270b-422a-b512-d7ca8b032886" path="/var/lib/kubelet/pods/30fd2d6d-270b-422a-b512-d7ca8b032886/volumes" Mar 20 11:13:42 crc kubenswrapper[4748]: I0320 11:13:42.928190 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:13:42 crc kubenswrapper[4748]: I0320 11:13:42.928708 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.147791 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566754-xn42x"] Mar 20 11:14:00 crc kubenswrapper[4748]: E0320 11:14:00.148711 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerName="extract-utilities" Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.148723 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerName="extract-utilities" Mar 
20 11:14:00 crc kubenswrapper[4748]: E0320 11:14:00.148739 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerName="registry-server"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.148745 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerName="registry-server"
Mar 20 11:14:00 crc kubenswrapper[4748]: E0320 11:14:00.148762 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerName="extract-content"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.148768 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerName="extract-content"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.148989 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="30fd2d6d-270b-422a-b512-d7ca8b032886" containerName="registry-server"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.149733 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-xn42x"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.153285 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.153548 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.156464 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.158956 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-xn42x"]
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.444042 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlmlt\" (UniqueName: \"kubernetes.io/projected/10f23fc0-639a-465e-b0d8-3ca44dc9d1ac-kube-api-access-mlmlt\") pod \"auto-csr-approver-29566754-xn42x\" (UID: \"10f23fc0-639a-465e-b0d8-3ca44dc9d1ac\") " pod="openshift-infra/auto-csr-approver-29566754-xn42x"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.545820 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlmlt\" (UniqueName: \"kubernetes.io/projected/10f23fc0-639a-465e-b0d8-3ca44dc9d1ac-kube-api-access-mlmlt\") pod \"auto-csr-approver-29566754-xn42x\" (UID: \"10f23fc0-639a-465e-b0d8-3ca44dc9d1ac\") " pod="openshift-infra/auto-csr-approver-29566754-xn42x"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.565018 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlmlt\" (UniqueName: \"kubernetes.io/projected/10f23fc0-639a-465e-b0d8-3ca44dc9d1ac-kube-api-access-mlmlt\") pod \"auto-csr-approver-29566754-xn42x\" (UID: \"10f23fc0-639a-465e-b0d8-3ca44dc9d1ac\") " pod="openshift-infra/auto-csr-approver-29566754-xn42x"
Mar 20 11:14:00 crc kubenswrapper[4748]: I0320 11:14:00.741311 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-xn42x"
Mar 20 11:14:01 crc kubenswrapper[4748]: I0320 11:14:01.175709 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-xn42x"]
Mar 20 11:14:01 crc kubenswrapper[4748]: I0320 11:14:01.194730 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-xn42x" event={"ID":"10f23fc0-639a-465e-b0d8-3ca44dc9d1ac","Type":"ContainerStarted","Data":"b7d0d0058443123232ab9f775f31494ff360afe9f70af22a584e0d7838e82032"}
Mar 20 11:14:05 crc kubenswrapper[4748]: I0320 11:14:05.226350 4748 generic.go:334] "Generic (PLEG): container finished" podID="10f23fc0-639a-465e-b0d8-3ca44dc9d1ac" containerID="a5cf8131bef5a9d9f94387f4389c972e26d4cd4663a2f0c987dc97b377674c74" exitCode=0
Mar 20 11:14:05 crc kubenswrapper[4748]: I0320 11:14:05.226478 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-xn42x" event={"ID":"10f23fc0-639a-465e-b0d8-3ca44dc9d1ac","Type":"ContainerDied","Data":"a5cf8131bef5a9d9f94387f4389c972e26d4cd4663a2f0c987dc97b377674c74"}
Mar 20 11:14:06 crc kubenswrapper[4748]: I0320 11:14:06.565788 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-xn42x"
Mar 20 11:14:06 crc kubenswrapper[4748]: I0320 11:14:06.673613 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlmlt\" (UniqueName: \"kubernetes.io/projected/10f23fc0-639a-465e-b0d8-3ca44dc9d1ac-kube-api-access-mlmlt\") pod \"10f23fc0-639a-465e-b0d8-3ca44dc9d1ac\" (UID: \"10f23fc0-639a-465e-b0d8-3ca44dc9d1ac\") "
Mar 20 11:14:06 crc kubenswrapper[4748]: I0320 11:14:06.679248 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f23fc0-639a-465e-b0d8-3ca44dc9d1ac-kube-api-access-mlmlt" (OuterVolumeSpecName: "kube-api-access-mlmlt") pod "10f23fc0-639a-465e-b0d8-3ca44dc9d1ac" (UID: "10f23fc0-639a-465e-b0d8-3ca44dc9d1ac"). InnerVolumeSpecName "kube-api-access-mlmlt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:14:06 crc kubenswrapper[4748]: I0320 11:14:06.777977 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlmlt\" (UniqueName: \"kubernetes.io/projected/10f23fc0-639a-465e-b0d8-3ca44dc9d1ac-kube-api-access-mlmlt\") on node \"crc\" DevicePath \"\""
Mar 20 11:14:07 crc kubenswrapper[4748]: I0320 11:14:07.245269 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-xn42x" event={"ID":"10f23fc0-639a-465e-b0d8-3ca44dc9d1ac","Type":"ContainerDied","Data":"b7d0d0058443123232ab9f775f31494ff360afe9f70af22a584e0d7838e82032"}
Mar 20 11:14:07 crc kubenswrapper[4748]: I0320 11:14:07.245596 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7d0d0058443123232ab9f775f31494ff360afe9f70af22a584e0d7838e82032"
Mar 20 11:14:07 crc kubenswrapper[4748]: I0320 11:14:07.245362 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-xn42x"
Mar 20 11:14:07 crc kubenswrapper[4748]: I0320 11:14:07.639929 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-tsmnw"]
Mar 20 11:14:07 crc kubenswrapper[4748]: I0320 11:14:07.648366 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-tsmnw"]
Mar 20 11:14:09 crc kubenswrapper[4748]: I0320 11:14:09.526536 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="446f9bb3-d2aa-4f62-afbe-74e108a6c13d" path="/var/lib/kubelet/pods/446f9bb3-d2aa-4f62-afbe-74e108a6c13d/volumes"
Mar 20 11:14:12 crc kubenswrapper[4748]: I0320 11:14:12.927918 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:14:12 crc kubenswrapper[4748]: I0320 11:14:12.928304 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:14:12 crc kubenswrapper[4748]: I0320 11:14:12.928360 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz"
Mar 20 11:14:12 crc kubenswrapper[4748]: I0320 11:14:12.929188 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"108bbe2c80954d52ef46b647af46c00e66773edc87b915e27f13030879de9d88"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 11:14:12 crc kubenswrapper[4748]: I0320 11:14:12.929247 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://108bbe2c80954d52ef46b647af46c00e66773edc87b915e27f13030879de9d88" gracePeriod=600
Mar 20 11:14:13 crc kubenswrapper[4748]: I0320 11:14:13.319670 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="108bbe2c80954d52ef46b647af46c00e66773edc87b915e27f13030879de9d88" exitCode=0
Mar 20 11:14:13 crc kubenswrapper[4748]: I0320 11:14:13.319745 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"108bbe2c80954d52ef46b647af46c00e66773edc87b915e27f13030879de9d88"}
Mar 20 11:14:13 crc kubenswrapper[4748]: I0320 11:14:13.320094 4748 scope.go:117] "RemoveContainer" containerID="e428509671e8fe99771dfcddf54085bbb58bd5b6557fd616733b187409873ead"
Mar 20 11:14:14 crc kubenswrapper[4748]: I0320 11:14:14.330143 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5"}
Mar 20 11:14:20 crc kubenswrapper[4748]: I0320 11:14:20.566149 4748 scope.go:117] "RemoveContainer" containerID="c4b3dd8c12084b37e4d24b9db4f6ab202949557327c8e66e35d4553c55c06976"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.153762 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"]
Mar 20 11:15:00 crc kubenswrapper[4748]: E0320 11:15:00.154803 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f23fc0-639a-465e-b0d8-3ca44dc9d1ac" containerName="oc"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.154818 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f23fc0-639a-465e-b0d8-3ca44dc9d1ac" containerName="oc"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.155083 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f23fc0-639a-465e-b0d8-3ca44dc9d1ac" containerName="oc"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.155897 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.158748 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.158748 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.166636 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"]
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.319236 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db15c3e8-540b-4a26-b05d-b56ed957a8bc-config-volume\") pod \"collect-profiles-29566755-tbdmw\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.319310 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn27n\" (UniqueName: \"kubernetes.io/projected/db15c3e8-540b-4a26-b05d-b56ed957a8bc-kube-api-access-rn27n\") pod \"collect-profiles-29566755-tbdmw\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.319558 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db15c3e8-540b-4a26-b05d-b56ed957a8bc-secret-volume\") pod \"collect-profiles-29566755-tbdmw\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.422103 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db15c3e8-540b-4a26-b05d-b56ed957a8bc-config-volume\") pod \"collect-profiles-29566755-tbdmw\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.422178 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn27n\" (UniqueName: \"kubernetes.io/projected/db15c3e8-540b-4a26-b05d-b56ed957a8bc-kube-api-access-rn27n\") pod \"collect-profiles-29566755-tbdmw\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.422278 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db15c3e8-540b-4a26-b05d-b56ed957a8bc-secret-volume\") pod \"collect-profiles-29566755-tbdmw\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.423243 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db15c3e8-540b-4a26-b05d-b56ed957a8bc-config-volume\") pod \"collect-profiles-29566755-tbdmw\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.433902 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db15c3e8-540b-4a26-b05d-b56ed957a8bc-secret-volume\") pod \"collect-profiles-29566755-tbdmw\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.440257 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn27n\" (UniqueName: \"kubernetes.io/projected/db15c3e8-540b-4a26-b05d-b56ed957a8bc-kube-api-access-rn27n\") pod \"collect-profiles-29566755-tbdmw\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.490636 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:00 crc kubenswrapper[4748]: I0320 11:15:00.917646 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"]
Mar 20 11:15:01 crc kubenswrapper[4748]: I0320 11:15:01.735199 4748 generic.go:334] "Generic (PLEG): container finished" podID="db15c3e8-540b-4a26-b05d-b56ed957a8bc" containerID="554d8f30dc4e32b964cfaa1ec07885aa0297794b94f6d70063855076cb48ba28" exitCode=0
Mar 20 11:15:01 crc kubenswrapper[4748]: I0320 11:15:01.735303 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw" event={"ID":"db15c3e8-540b-4a26-b05d-b56ed957a8bc","Type":"ContainerDied","Data":"554d8f30dc4e32b964cfaa1ec07885aa0297794b94f6d70063855076cb48ba28"}
Mar 20 11:15:01 crc kubenswrapper[4748]: I0320 11:15:01.735499 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw" event={"ID":"db15c3e8-540b-4a26-b05d-b56ed957a8bc","Type":"ContainerStarted","Data":"c169c8d150457d5d1ad56d842fac83f6dfa691d4177f3f15c4085496b94a8988"}
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.143669 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.279869 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn27n\" (UniqueName: \"kubernetes.io/projected/db15c3e8-540b-4a26-b05d-b56ed957a8bc-kube-api-access-rn27n\") pod \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") "
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.280011 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db15c3e8-540b-4a26-b05d-b56ed957a8bc-config-volume\") pod \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") "
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.280128 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db15c3e8-540b-4a26-b05d-b56ed957a8bc-secret-volume\") pod \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\" (UID: \"db15c3e8-540b-4a26-b05d-b56ed957a8bc\") "
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.281493 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db15c3e8-540b-4a26-b05d-b56ed957a8bc-config-volume" (OuterVolumeSpecName: "config-volume") pod "db15c3e8-540b-4a26-b05d-b56ed957a8bc" (UID: "db15c3e8-540b-4a26-b05d-b56ed957a8bc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.292776 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db15c3e8-540b-4a26-b05d-b56ed957a8bc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "db15c3e8-540b-4a26-b05d-b56ed957a8bc" (UID: "db15c3e8-540b-4a26-b05d-b56ed957a8bc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.293354 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db15c3e8-540b-4a26-b05d-b56ed957a8bc-kube-api-access-rn27n" (OuterVolumeSpecName: "kube-api-access-rn27n") pod "db15c3e8-540b-4a26-b05d-b56ed957a8bc" (UID: "db15c3e8-540b-4a26-b05d-b56ed957a8bc"). InnerVolumeSpecName "kube-api-access-rn27n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.382478 4748 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db15c3e8-540b-4a26-b05d-b56ed957a8bc-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.382519 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn27n\" (UniqueName: \"kubernetes.io/projected/db15c3e8-540b-4a26-b05d-b56ed957a8bc-kube-api-access-rn27n\") on node \"crc\" DevicePath \"\""
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.382528 4748 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db15c3e8-540b-4a26-b05d-b56ed957a8bc-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.752744 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw" event={"ID":"db15c3e8-540b-4a26-b05d-b56ed957a8bc","Type":"ContainerDied","Data":"c169c8d150457d5d1ad56d842fac83f6dfa691d4177f3f15c4085496b94a8988"}
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.752786 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c169c8d150457d5d1ad56d842fac83f6dfa691d4177f3f15c4085496b94a8988"
Mar 20 11:15:03 crc kubenswrapper[4748]: I0320 11:15:03.752814 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"
Mar 20 11:15:04 crc kubenswrapper[4748]: I0320 11:15:04.251120 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7"]
Mar 20 11:15:04 crc kubenswrapper[4748]: I0320 11:15:04.259146 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566710-vh2t7"]
Mar 20 11:15:05 crc kubenswrapper[4748]: I0320 11:15:05.528420 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd2dd89-ba90-440f-abc8-74ab27d7db69" path="/var/lib/kubelet/pods/2dd2dd89-ba90-440f-abc8-74ab27d7db69/volumes"
Mar 20 11:15:20 crc kubenswrapper[4748]: I0320 11:15:20.647825 4748 scope.go:117] "RemoveContainer" containerID="f26e13d2a4bc838ed9c49216a3f99f4fc8a056e05e61c73cdfe6e29da627a0df"
Mar 20 11:15:36 crc kubenswrapper[4748]: I0320 11:15:36.838109 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wcfwt"]
Mar 20 11:15:36 crc kubenswrapper[4748]: E0320 11:15:36.839207 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db15c3e8-540b-4a26-b05d-b56ed957a8bc" containerName="collect-profiles"
Mar 20 11:15:36 crc kubenswrapper[4748]: I0320 11:15:36.839225 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="db15c3e8-540b-4a26-b05d-b56ed957a8bc" containerName="collect-profiles"
Mar 20 11:15:36 crc kubenswrapper[4748]: I0320 11:15:36.839539 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="db15c3e8-540b-4a26-b05d-b56ed957a8bc" containerName="collect-profiles"
Mar 20 11:15:36 crc kubenswrapper[4748]: I0320 11:15:36.841797 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:36 crc kubenswrapper[4748]: I0320 11:15:36.860313 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcfwt"]
Mar 20 11:15:36 crc kubenswrapper[4748]: I0320 11:15:36.941960 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-catalog-content\") pod \"community-operators-wcfwt\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") " pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:36 crc kubenswrapper[4748]: I0320 11:15:36.942297 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-utilities\") pod \"community-operators-wcfwt\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") " pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:36 crc kubenswrapper[4748]: I0320 11:15:36.942327 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96n6p\" (UniqueName: \"kubernetes.io/projected/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-kube-api-access-96n6p\") pod \"community-operators-wcfwt\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") " pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:37 crc kubenswrapper[4748]: I0320 11:15:37.044517 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-utilities\") pod \"community-operators-wcfwt\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") " pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:37 crc kubenswrapper[4748]: I0320 11:15:37.044576 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96n6p\" (UniqueName: \"kubernetes.io/projected/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-kube-api-access-96n6p\") pod \"community-operators-wcfwt\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") " pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:37 crc kubenswrapper[4748]: I0320 11:15:37.044654 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-catalog-content\") pod \"community-operators-wcfwt\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") " pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:37 crc kubenswrapper[4748]: I0320 11:15:37.045010 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-utilities\") pod \"community-operators-wcfwt\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") " pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:37 crc kubenswrapper[4748]: I0320 11:15:37.045272 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-catalog-content\") pod \"community-operators-wcfwt\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") " pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:37 crc kubenswrapper[4748]: I0320 11:15:37.067723 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96n6p\" (UniqueName: \"kubernetes.io/projected/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-kube-api-access-96n6p\") pod \"community-operators-wcfwt\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") " pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:37 crc kubenswrapper[4748]: I0320 11:15:37.186491 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:37 crc kubenswrapper[4748]: I0320 11:15:37.783059 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcfwt"]
Mar 20 11:15:38 crc kubenswrapper[4748]: I0320 11:15:38.061040 4748 generic.go:334] "Generic (PLEG): container finished" podID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerID="5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982" exitCode=0
Mar 20 11:15:38 crc kubenswrapper[4748]: I0320 11:15:38.061108 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcfwt" event={"ID":"391b0f44-84a4-4a23-98a9-c0ca16b01cd5","Type":"ContainerDied","Data":"5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982"}
Mar 20 11:15:38 crc kubenswrapper[4748]: I0320 11:15:38.061368 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcfwt" event={"ID":"391b0f44-84a4-4a23-98a9-c0ca16b01cd5","Type":"ContainerStarted","Data":"9d0b941128fd6a88cca489beb6a165cebfef99f42e3f84411133222e49322d72"}
Mar 20 11:15:38 crc kubenswrapper[4748]: I0320 11:15:38.063016 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 11:15:39 crc kubenswrapper[4748]: I0320 11:15:39.073239 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcfwt" event={"ID":"391b0f44-84a4-4a23-98a9-c0ca16b01cd5","Type":"ContainerStarted","Data":"98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de"}
Mar 20 11:15:41 crc kubenswrapper[4748]: I0320 11:15:41.089788 4748 generic.go:334] "Generic (PLEG): container finished" podID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerID="98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de" exitCode=0
Mar 20 11:15:41 crc kubenswrapper[4748]: I0320 11:15:41.089824 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcfwt" event={"ID":"391b0f44-84a4-4a23-98a9-c0ca16b01cd5","Type":"ContainerDied","Data":"98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de"}
Mar 20 11:15:42 crc kubenswrapper[4748]: I0320 11:15:42.099762 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcfwt" event={"ID":"391b0f44-84a4-4a23-98a9-c0ca16b01cd5","Type":"ContainerStarted","Data":"541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200"}
Mar 20 11:15:42 crc kubenswrapper[4748]: I0320 11:15:42.136022 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wcfwt" podStartSLOduration=2.416278713 podStartE2EDuration="6.135993797s" podCreationTimestamp="2026-03-20 11:15:36 +0000 UTC" firstStartedPulling="2026-03-20 11:15:38.062707466 +0000 UTC m=+2373.204253280" lastFinishedPulling="2026-03-20 11:15:41.78242255 +0000 UTC m=+2376.923968364" observedRunningTime="2026-03-20 11:15:42.126403245 +0000 UTC m=+2377.267949059" watchObservedRunningTime="2026-03-20 11:15:42.135993797 +0000 UTC m=+2377.277539611"
Mar 20 11:15:47 crc kubenswrapper[4748]: I0320 11:15:47.186597 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:47 crc kubenswrapper[4748]: I0320 11:15:47.187954 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:47 crc kubenswrapper[4748]: I0320 11:15:47.235797 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:48 crc kubenswrapper[4748]: I0320 11:15:48.193170 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:48 crc kubenswrapper[4748]: I0320 11:15:48.242395 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wcfwt"]
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.167334 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wcfwt" podUID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerName="registry-server" containerID="cri-o://541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200" gracePeriod=2
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.647355 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.718661 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-utilities\") pod \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") "
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.718724 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-catalog-content\") pod \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") "
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.718815 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96n6p\" (UniqueName: \"kubernetes.io/projected/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-kube-api-access-96n6p\") pod \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\" (UID: \"391b0f44-84a4-4a23-98a9-c0ca16b01cd5\") "
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.720304 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-utilities" (OuterVolumeSpecName: "utilities") pod "391b0f44-84a4-4a23-98a9-c0ca16b01cd5" (UID: "391b0f44-84a4-4a23-98a9-c0ca16b01cd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.724828 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-kube-api-access-96n6p" (OuterVolumeSpecName: "kube-api-access-96n6p") pod "391b0f44-84a4-4a23-98a9-c0ca16b01cd5" (UID: "391b0f44-84a4-4a23-98a9-c0ca16b01cd5"). InnerVolumeSpecName "kube-api-access-96n6p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.779654 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "391b0f44-84a4-4a23-98a9-c0ca16b01cd5" (UID: "391b0f44-84a4-4a23-98a9-c0ca16b01cd5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.821734 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96n6p\" (UniqueName: \"kubernetes.io/projected/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-kube-api-access-96n6p\") on node \"crc\" DevicePath \"\""
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.821772 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:15:50 crc kubenswrapper[4748]: I0320 11:15:50.821782 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/391b0f44-84a4-4a23-98a9-c0ca16b01cd5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.186922 4748 generic.go:334] "Generic (PLEG): container finished" podID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerID="541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200" exitCode=0
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.186986 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcfwt"
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.186985 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcfwt" event={"ID":"391b0f44-84a4-4a23-98a9-c0ca16b01cd5","Type":"ContainerDied","Data":"541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200"}
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.187348 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcfwt" event={"ID":"391b0f44-84a4-4a23-98a9-c0ca16b01cd5","Type":"ContainerDied","Data":"9d0b941128fd6a88cca489beb6a165cebfef99f42e3f84411133222e49322d72"}
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.187368 4748 scope.go:117] "RemoveContainer" containerID="541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200"
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.211254 4748 scope.go:117] "RemoveContainer" containerID="98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de"
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.229284 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wcfwt"]
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.238283 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wcfwt"]
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.243256 4748 scope.go:117] "RemoveContainer" containerID="5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982"
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.289199 4748 scope.go:117] "RemoveContainer" containerID="541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200"
Mar 20 11:15:51 crc kubenswrapper[4748]: E0320 11:15:51.289820 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200\": container with ID starting with 541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200 not found: ID does not exist" containerID="541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200"
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.289908 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200"} err="failed to get container status \"541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200\": rpc error: code = NotFound desc = could not find container \"541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200\": container with ID starting with 541a1c19dd3d4407f3d8b29e8b4a69b5b6dbddd4e11836f800066b75769ea200 not found: ID does not exist"
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.289954 4748 scope.go:117] "RemoveContainer" containerID="98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de"
Mar 20 11:15:51 crc kubenswrapper[4748]: E0320 11:15:51.290402 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de\": container with ID starting with 98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de not found: ID does not exist" containerID="98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de"
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.290477 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de"} err="failed to get container status \"98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de\": rpc error: code = NotFound desc = could not find container \"98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de\": container with ID starting with 98b4069f8da92afa5c85c88097e69a5a31be70268289b711f8933c91b61d38de not found: ID does not exist"
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.290528 4748 scope.go:117] "RemoveContainer" containerID="5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982"
Mar 20 11:15:51 crc kubenswrapper[4748]: E0320 11:15:51.290978 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982\": container with ID starting with 5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982 not found: ID does not exist" containerID="5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982"
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.291013 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982"} err="failed to get container status \"5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982\": rpc error: code = NotFound desc = could not find container \"5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982\": container with ID starting with 5e2d41fa00b53a6aa0106ab5604216a4644bb02eb1ec33aa1f358229183fe982 not found: ID does not exist"
Mar 20 11:15:51 crc kubenswrapper[4748]: I0320 11:15:51.525349 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" path="/var/lib/kubelet/pods/391b0f44-84a4-4a23-98a9-c0ca16b01cd5/volumes"
Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.145140 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566756-s7r5n"]
Mar 20 11:16:00 crc kubenswrapper[4748]: E0320 11:16:00.155600 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerName="extract-utilities"
Mar 20 11:16:00 crc
kubenswrapper[4748]: I0320 11:16:00.155642 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerName="extract-utilities" Mar 20 11:16:00 crc kubenswrapper[4748]: E0320 11:16:00.155667 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerName="registry-server" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.155673 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerName="registry-server" Mar 20 11:16:00 crc kubenswrapper[4748]: E0320 11:16:00.155693 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerName="extract-content" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.155698 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerName="extract-content" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.155961 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="391b0f44-84a4-4a23-98a9-c0ca16b01cd5" containerName="registry-server" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.156608 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-s7r5n"] Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.156704 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-s7r5n" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.159409 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.161275 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.161421 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.209100 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkjdm\" (UniqueName: \"kubernetes.io/projected/193a6d7a-46ae-4834-a29d-b539ce90fc65-kube-api-access-bkjdm\") pod \"auto-csr-approver-29566756-s7r5n\" (UID: \"193a6d7a-46ae-4834-a29d-b539ce90fc65\") " pod="openshift-infra/auto-csr-approver-29566756-s7r5n" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.312040 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkjdm\" (UniqueName: \"kubernetes.io/projected/193a6d7a-46ae-4834-a29d-b539ce90fc65-kube-api-access-bkjdm\") pod \"auto-csr-approver-29566756-s7r5n\" (UID: \"193a6d7a-46ae-4834-a29d-b539ce90fc65\") " pod="openshift-infra/auto-csr-approver-29566756-s7r5n" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.331520 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkjdm\" (UniqueName: \"kubernetes.io/projected/193a6d7a-46ae-4834-a29d-b539ce90fc65-kube-api-access-bkjdm\") pod \"auto-csr-approver-29566756-s7r5n\" (UID: \"193a6d7a-46ae-4834-a29d-b539ce90fc65\") " pod="openshift-infra/auto-csr-approver-29566756-s7r5n" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.484595 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-s7r5n" Mar 20 11:16:00 crc kubenswrapper[4748]: I0320 11:16:00.899271 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-s7r5n"] Mar 20 11:16:01 crc kubenswrapper[4748]: I0320 11:16:01.283480 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-s7r5n" event={"ID":"193a6d7a-46ae-4834-a29d-b539ce90fc65","Type":"ContainerStarted","Data":"444bd3e7788d5eb8661b8d307d3f3738bcf60e84269e226675c8c44cd77ca324"} Mar 20 11:16:02 crc kubenswrapper[4748]: I0320 11:16:02.294382 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-s7r5n" event={"ID":"193a6d7a-46ae-4834-a29d-b539ce90fc65","Type":"ContainerStarted","Data":"27e518e54fdd74a7549c9ad651969f888e457063ac2fc259671305be23e1e38b"} Mar 20 11:16:02 crc kubenswrapper[4748]: I0320 11:16:02.315298 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566756-s7r5n" podStartSLOduration=1.235986363 podStartE2EDuration="2.31527685s" podCreationTimestamp="2026-03-20 11:16:00 +0000 UTC" firstStartedPulling="2026-03-20 11:16:00.905374494 +0000 UTC m=+2396.046920308" lastFinishedPulling="2026-03-20 11:16:01.984664971 +0000 UTC m=+2397.126210795" observedRunningTime="2026-03-20 11:16:02.310410367 +0000 UTC m=+2397.451956171" watchObservedRunningTime="2026-03-20 11:16:02.31527685 +0000 UTC m=+2397.456822664" Mar 20 11:16:03 crc kubenswrapper[4748]: I0320 11:16:03.303654 4748 generic.go:334] "Generic (PLEG): container finished" podID="193a6d7a-46ae-4834-a29d-b539ce90fc65" containerID="27e518e54fdd74a7549c9ad651969f888e457063ac2fc259671305be23e1e38b" exitCode=0 Mar 20 11:16:03 crc kubenswrapper[4748]: I0320 11:16:03.303709 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-s7r5n" 
event={"ID":"193a6d7a-46ae-4834-a29d-b539ce90fc65","Type":"ContainerDied","Data":"27e518e54fdd74a7549c9ad651969f888e457063ac2fc259671305be23e1e38b"} Mar 20 11:16:04 crc kubenswrapper[4748]: I0320 11:16:04.650309 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-s7r5n" Mar 20 11:16:04 crc kubenswrapper[4748]: I0320 11:16:04.707178 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkjdm\" (UniqueName: \"kubernetes.io/projected/193a6d7a-46ae-4834-a29d-b539ce90fc65-kube-api-access-bkjdm\") pod \"193a6d7a-46ae-4834-a29d-b539ce90fc65\" (UID: \"193a6d7a-46ae-4834-a29d-b539ce90fc65\") " Mar 20 11:16:04 crc kubenswrapper[4748]: I0320 11:16:04.720379 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193a6d7a-46ae-4834-a29d-b539ce90fc65-kube-api-access-bkjdm" (OuterVolumeSpecName: "kube-api-access-bkjdm") pod "193a6d7a-46ae-4834-a29d-b539ce90fc65" (UID: "193a6d7a-46ae-4834-a29d-b539ce90fc65"). InnerVolumeSpecName "kube-api-access-bkjdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:16:04 crc kubenswrapper[4748]: I0320 11:16:04.811387 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkjdm\" (UniqueName: \"kubernetes.io/projected/193a6d7a-46ae-4834-a29d-b539ce90fc65-kube-api-access-bkjdm\") on node \"crc\" DevicePath \"\"" Mar 20 11:16:05 crc kubenswrapper[4748]: I0320 11:16:05.327304 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-s7r5n" event={"ID":"193a6d7a-46ae-4834-a29d-b539ce90fc65","Type":"ContainerDied","Data":"444bd3e7788d5eb8661b8d307d3f3738bcf60e84269e226675c8c44cd77ca324"} Mar 20 11:16:05 crc kubenswrapper[4748]: I0320 11:16:05.327612 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="444bd3e7788d5eb8661b8d307d3f3738bcf60e84269e226675c8c44cd77ca324" Mar 20 11:16:05 crc kubenswrapper[4748]: I0320 11:16:05.327341 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-s7r5n" Mar 20 11:16:05 crc kubenswrapper[4748]: I0320 11:16:05.378089 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-jcp97"] Mar 20 11:16:05 crc kubenswrapper[4748]: I0320 11:16:05.386306 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-jcp97"] Mar 20 11:16:05 crc kubenswrapper[4748]: I0320 11:16:05.544750 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13ae640-255e-402f-b0ab-a8fe649902bb" path="/var/lib/kubelet/pods/f13ae640-255e-402f-b0ab-a8fe649902bb/volumes" Mar 20 11:16:20 crc kubenswrapper[4748]: I0320 11:16:20.709941 4748 scope.go:117] "RemoveContainer" containerID="35540e14a6eb77ccda615e494db25e46793126f3df163f027d7ece3077f932aa" Mar 20 11:16:42 crc kubenswrapper[4748]: I0320 11:16:42.928031 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:16:42 crc kubenswrapper[4748]: I0320 11:16:42.928557 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:17:12 crc kubenswrapper[4748]: I0320 11:17:12.928379 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:17:12 crc kubenswrapper[4748]: I0320 11:17:12.928963 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:17:38 crc kubenswrapper[4748]: I0320 11:17:38.162390 4748 generic.go:334] "Generic (PLEG): container finished" podID="a1734ea2-369d-4c96-aca3-1a450a82e9dc" containerID="ea3f2a156242454a0399dfbf9e3ce7cc44aea6e8255d0e4a48d05d8262e0e37a" exitCode=0 Mar 20 11:17:38 crc kubenswrapper[4748]: I0320 11:17:38.162514 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" event={"ID":"a1734ea2-369d-4c96-aca3-1a450a82e9dc","Type":"ContainerDied","Data":"ea3f2a156242454a0399dfbf9e3ce7cc44aea6e8255d0e4a48d05d8262e0e37a"} Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 
11:17:39.545691 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.626105 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b6tr\" (UniqueName: \"kubernetes.io/projected/a1734ea2-369d-4c96-aca3-1a450a82e9dc-kube-api-access-2b6tr\") pod \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.626154 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-combined-ca-bundle\") pod \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.626212 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-secret-0\") pod \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.626238 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-inventory\") pod \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.626374 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-ssh-key-openstack-edpm-ipam\") pod \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\" (UID: \"a1734ea2-369d-4c96-aca3-1a450a82e9dc\") " Mar 20 11:17:39 crc 
kubenswrapper[4748]: I0320 11:17:39.639115 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1734ea2-369d-4c96-aca3-1a450a82e9dc-kube-api-access-2b6tr" (OuterVolumeSpecName: "kube-api-access-2b6tr") pod "a1734ea2-369d-4c96-aca3-1a450a82e9dc" (UID: "a1734ea2-369d-4c96-aca3-1a450a82e9dc"). InnerVolumeSpecName "kube-api-access-2b6tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.639148 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a1734ea2-369d-4c96-aca3-1a450a82e9dc" (UID: "a1734ea2-369d-4c96-aca3-1a450a82e9dc"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.654220 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a1734ea2-369d-4c96-aca3-1a450a82e9dc" (UID: "a1734ea2-369d-4c96-aca3-1a450a82e9dc"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.656918 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-inventory" (OuterVolumeSpecName: "inventory") pod "a1734ea2-369d-4c96-aca3-1a450a82e9dc" (UID: "a1734ea2-369d-4c96-aca3-1a450a82e9dc"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.670947 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a1734ea2-369d-4c96-aca3-1a450a82e9dc" (UID: "a1734ea2-369d-4c96-aca3-1a450a82e9dc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.730048 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b6tr\" (UniqueName: \"kubernetes.io/projected/a1734ea2-369d-4c96-aca3-1a450a82e9dc-kube-api-access-2b6tr\") on node \"crc\" DevicePath \"\"" Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.730111 4748 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.730136 4748 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.730163 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:17:39 crc kubenswrapper[4748]: I0320 11:17:39.730185 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1734ea2-369d-4c96-aca3-1a450a82e9dc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.180966 4748 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" event={"ID":"a1734ea2-369d-4c96-aca3-1a450a82e9dc","Type":"ContainerDied","Data":"c6471593d21744153b91f00bc67341173fb979186036d22618c3eed0c90ba4be"} Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.181277 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6471593d21744153b91f00bc67341173fb979186036d22618c3eed0c90ba4be" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.181040 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.421910 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn"] Mar 20 11:17:40 crc kubenswrapper[4748]: E0320 11:17:40.422405 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193a6d7a-46ae-4834-a29d-b539ce90fc65" containerName="oc" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.422427 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="193a6d7a-46ae-4834-a29d-b539ce90fc65" containerName="oc" Mar 20 11:17:40 crc kubenswrapper[4748]: E0320 11:17:40.422450 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1734ea2-369d-4c96-aca3-1a450a82e9dc" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.422460 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1734ea2-369d-4c96-aca3-1a450a82e9dc" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.422673 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1734ea2-369d-4c96-aca3-1a450a82e9dc" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.422707 4748 
memory_manager.go:354] "RemoveStaleState removing state" podUID="193a6d7a-46ae-4834-a29d-b539ce90fc65" containerName="oc" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.423492 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.425085 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.425590 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.426030 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.426250 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.426399 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.426878 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.427867 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.436474 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn"] Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.544919 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/4457d05c-f317-4cf2-97cb-03888616f4af-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.545028 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.545098 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.545138 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.545169 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: 
\"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.545216 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.545259 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78s4q\" (UniqueName: \"kubernetes.io/projected/4457d05c-f317-4cf2-97cb-03888616f4af-kube-api-access-78s4q\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.545311 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.545376 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: 
I0320 11:17:40.545566 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.545673 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647313 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647369 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647400 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647428 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647476 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78s4q\" (UniqueName: \"kubernetes.io/projected/4457d05c-f317-4cf2-97cb-03888616f4af-kube-api-access-78s4q\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647527 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647570 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647721 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647762 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647787 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4457d05c-f317-4cf2-97cb-03888616f4af-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.647863 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.649213 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" 
(UniqueName: \"kubernetes.io/configmap/4457d05c-f317-4cf2-97cb-03888616f4af-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.653736 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.653899 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.654477 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.654742 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.657580 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.658043 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.660487 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.660683 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.660935 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.667507 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78s4q\" (UniqueName: \"kubernetes.io/projected/4457d05c-f317-4cf2-97cb-03888616f4af-kube-api-access-78s4q\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fjmvn\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:40 crc kubenswrapper[4748]: I0320 11:17:40.743616 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:17:41 crc kubenswrapper[4748]: I0320 11:17:41.268878 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn"] Mar 20 11:17:42 crc kubenswrapper[4748]: I0320 11:17:42.197496 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" event={"ID":"4457d05c-f317-4cf2-97cb-03888616f4af","Type":"ContainerStarted","Data":"b2eedaa0dc1cf26cccb7bf0c3a08e0b17456243e9d388ad1513526b968bec243"} Mar 20 11:17:42 crc kubenswrapper[4748]: I0320 11:17:42.197829 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" event={"ID":"4457d05c-f317-4cf2-97cb-03888616f4af","Type":"ContainerStarted","Data":"c6bcdc0701b37078e394a09f310a52342ae751f79ed6c1aae2f7a0f63ee65c7f"} Mar 20 11:17:42 crc kubenswrapper[4748]: I0320 11:17:42.223946 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" podStartSLOduration=2.049294647 
podStartE2EDuration="2.223926021s" podCreationTimestamp="2026-03-20 11:17:40 +0000 UTC" firstStartedPulling="2026-03-20 11:17:41.273086646 +0000 UTC m=+2496.414632460" lastFinishedPulling="2026-03-20 11:17:41.44771802 +0000 UTC m=+2496.589263834" observedRunningTime="2026-03-20 11:17:42.215627032 +0000 UTC m=+2497.357172846" watchObservedRunningTime="2026-03-20 11:17:42.223926021 +0000 UTC m=+2497.365471835" Mar 20 11:17:42 crc kubenswrapper[4748]: I0320 11:17:42.928073 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:17:42 crc kubenswrapper[4748]: I0320 11:17:42.928139 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:17:42 crc kubenswrapper[4748]: I0320 11:17:42.928184 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 11:17:42 crc kubenswrapper[4748]: I0320 11:17:42.928931 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:17:42 crc kubenswrapper[4748]: I0320 11:17:42.929008 4748 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" gracePeriod=600 Mar 20 11:17:43 crc kubenswrapper[4748]: E0320 11:17:43.049733 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:17:43 crc kubenswrapper[4748]: I0320 11:17:43.206630 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" exitCode=0 Mar 20 11:17:43 crc kubenswrapper[4748]: I0320 11:17:43.206686 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5"} Mar 20 11:17:43 crc kubenswrapper[4748]: I0320 11:17:43.206740 4748 scope.go:117] "RemoveContainer" containerID="108bbe2c80954d52ef46b647af46c00e66773edc87b915e27f13030879de9d88" Mar 20 11:17:43 crc kubenswrapper[4748]: I0320 11:17:43.207301 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:17:43 crc kubenswrapper[4748]: E0320 11:17:43.207626 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:17:55 crc kubenswrapper[4748]: I0320 11:17:55.523778 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:17:55 crc kubenswrapper[4748]: E0320 11:17:55.524576 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:18:00 crc kubenswrapper[4748]: I0320 11:18:00.133347 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566758-pwps5"] Mar 20 11:18:00 crc kubenswrapper[4748]: I0320 11:18:00.135375 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-pwps5" Mar 20 11:18:00 crc kubenswrapper[4748]: I0320 11:18:00.138031 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:18:00 crc kubenswrapper[4748]: I0320 11:18:00.138281 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:18:00 crc kubenswrapper[4748]: I0320 11:18:00.143268 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:18:00 crc kubenswrapper[4748]: I0320 11:18:00.143324 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-pwps5"] Mar 20 11:18:00 crc kubenswrapper[4748]: I0320 11:18:00.218781 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzz47\" (UniqueName: \"kubernetes.io/projected/d203f64a-081c-4375-99d8-3eba99e8f895-kube-api-access-jzz47\") pod \"auto-csr-approver-29566758-pwps5\" (UID: \"d203f64a-081c-4375-99d8-3eba99e8f895\") " pod="openshift-infra/auto-csr-approver-29566758-pwps5" Mar 20 11:18:00 crc kubenswrapper[4748]: I0320 11:18:00.321320 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzz47\" (UniqueName: \"kubernetes.io/projected/d203f64a-081c-4375-99d8-3eba99e8f895-kube-api-access-jzz47\") pod \"auto-csr-approver-29566758-pwps5\" (UID: \"d203f64a-081c-4375-99d8-3eba99e8f895\") " pod="openshift-infra/auto-csr-approver-29566758-pwps5" Mar 20 11:18:00 crc kubenswrapper[4748]: I0320 11:18:00.340943 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzz47\" (UniqueName: \"kubernetes.io/projected/d203f64a-081c-4375-99d8-3eba99e8f895-kube-api-access-jzz47\") pod \"auto-csr-approver-29566758-pwps5\" (UID: \"d203f64a-081c-4375-99d8-3eba99e8f895\") " 
pod="openshift-infra/auto-csr-approver-29566758-pwps5" Mar 20 11:18:00 crc kubenswrapper[4748]: I0320 11:18:00.458687 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-pwps5" Mar 20 11:18:01 crc kubenswrapper[4748]: I0320 11:18:00.918566 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-pwps5"] Mar 20 11:18:01 crc kubenswrapper[4748]: I0320 11:18:01.391553 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-pwps5" event={"ID":"d203f64a-081c-4375-99d8-3eba99e8f895","Type":"ContainerStarted","Data":"3463fa4b597fbf937cb26c612a717ff39723555fb08acd8daab6c97f079b5536"} Mar 20 11:18:03 crc kubenswrapper[4748]: I0320 11:18:03.414899 4748 generic.go:334] "Generic (PLEG): container finished" podID="d203f64a-081c-4375-99d8-3eba99e8f895" containerID="d3aa0a4d0156df77f2c3ca4480f412de71c3687ec773f7f50ce6f3021f63fbaa" exitCode=0 Mar 20 11:18:03 crc kubenswrapper[4748]: I0320 11:18:03.415008 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-pwps5" event={"ID":"d203f64a-081c-4375-99d8-3eba99e8f895","Type":"ContainerDied","Data":"d3aa0a4d0156df77f2c3ca4480f412de71c3687ec773f7f50ce6f3021f63fbaa"} Mar 20 11:18:04 crc kubenswrapper[4748]: I0320 11:18:04.727998 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-pwps5" Mar 20 11:18:04 crc kubenswrapper[4748]: I0320 11:18:04.809418 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzz47\" (UniqueName: \"kubernetes.io/projected/d203f64a-081c-4375-99d8-3eba99e8f895-kube-api-access-jzz47\") pod \"d203f64a-081c-4375-99d8-3eba99e8f895\" (UID: \"d203f64a-081c-4375-99d8-3eba99e8f895\") " Mar 20 11:18:04 crc kubenswrapper[4748]: I0320 11:18:04.816027 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d203f64a-081c-4375-99d8-3eba99e8f895-kube-api-access-jzz47" (OuterVolumeSpecName: "kube-api-access-jzz47") pod "d203f64a-081c-4375-99d8-3eba99e8f895" (UID: "d203f64a-081c-4375-99d8-3eba99e8f895"). InnerVolumeSpecName "kube-api-access-jzz47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:18:04 crc kubenswrapper[4748]: I0320 11:18:04.912219 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzz47\" (UniqueName: \"kubernetes.io/projected/d203f64a-081c-4375-99d8-3eba99e8f895-kube-api-access-jzz47\") on node \"crc\" DevicePath \"\"" Mar 20 11:18:05 crc kubenswrapper[4748]: I0320 11:18:05.434807 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-pwps5" event={"ID":"d203f64a-081c-4375-99d8-3eba99e8f895","Type":"ContainerDied","Data":"3463fa4b597fbf937cb26c612a717ff39723555fb08acd8daab6c97f079b5536"} Mar 20 11:18:05 crc kubenswrapper[4748]: I0320 11:18:05.434870 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3463fa4b597fbf937cb26c612a717ff39723555fb08acd8daab6c97f079b5536" Mar 20 11:18:05 crc kubenswrapper[4748]: I0320 11:18:05.434877 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-pwps5" Mar 20 11:18:05 crc kubenswrapper[4748]: I0320 11:18:05.815297 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-9prbv"] Mar 20 11:18:05 crc kubenswrapper[4748]: I0320 11:18:05.824390 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-9prbv"] Mar 20 11:18:07 crc kubenswrapper[4748]: I0320 11:18:07.516361 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:18:07 crc kubenswrapper[4748]: E0320 11:18:07.517272 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:18:07 crc kubenswrapper[4748]: I0320 11:18:07.526737 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da6dab72-7622-4f0e-bb06-c33bacf10588" path="/var/lib/kubelet/pods/da6dab72-7622-4f0e-bb06-c33bacf10588/volumes" Mar 20 11:18:19 crc kubenswrapper[4748]: I0320 11:18:19.515850 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:18:19 crc kubenswrapper[4748]: E0320 11:18:19.516689 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" 
podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:18:20 crc kubenswrapper[4748]: I0320 11:18:20.858166 4748 scope.go:117] "RemoveContainer" containerID="d4ff2a6951846e8772811a165a7ade144c87f046dd3deeb17aba0c54539f067a" Mar 20 11:18:33 crc kubenswrapper[4748]: I0320 11:18:33.515744 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:18:33 crc kubenswrapper[4748]: E0320 11:18:33.516567 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:18:45 crc kubenswrapper[4748]: I0320 11:18:45.525573 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:18:45 crc kubenswrapper[4748]: E0320 11:18:45.527107 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:18:59 crc kubenswrapper[4748]: I0320 11:18:59.516165 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:18:59 crc kubenswrapper[4748]: E0320 11:18:59.518438 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:19:11 crc kubenswrapper[4748]: I0320 11:19:11.516005 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:19:11 crc kubenswrapper[4748]: E0320 11:19:11.516810 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:19:26 crc kubenswrapper[4748]: I0320 11:19:26.516738 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:19:26 crc kubenswrapper[4748]: E0320 11:19:26.517515 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:19:41 crc kubenswrapper[4748]: I0320 11:19:41.516167 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:19:41 crc kubenswrapper[4748]: E0320 11:19:41.517086 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:19:53 crc kubenswrapper[4748]: I0320 11:19:53.516069 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:19:53 crc kubenswrapper[4748]: E0320 11:19:53.517793 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.148773 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566760-pf429"] Mar 20 11:20:00 crc kubenswrapper[4748]: E0320 11:20:00.149985 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d203f64a-081c-4375-99d8-3eba99e8f895" containerName="oc" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.150004 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d203f64a-081c-4375-99d8-3eba99e8f895" containerName="oc" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.150221 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="d203f64a-081c-4375-99d8-3eba99e8f895" containerName="oc" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.150957 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-pf429" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.154965 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.155283 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.155411 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.169296 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-pf429"] Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.261562 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwz5m\" (UniqueName: \"kubernetes.io/projected/11faf586-5b92-4f52-87e2-cdb907eb9f9a-kube-api-access-kwz5m\") pod \"auto-csr-approver-29566760-pf429\" (UID: \"11faf586-5b92-4f52-87e2-cdb907eb9f9a\") " pod="openshift-infra/auto-csr-approver-29566760-pf429" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.363733 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwz5m\" (UniqueName: \"kubernetes.io/projected/11faf586-5b92-4f52-87e2-cdb907eb9f9a-kube-api-access-kwz5m\") pod \"auto-csr-approver-29566760-pf429\" (UID: \"11faf586-5b92-4f52-87e2-cdb907eb9f9a\") " pod="openshift-infra/auto-csr-approver-29566760-pf429" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.384040 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwz5m\" (UniqueName: \"kubernetes.io/projected/11faf586-5b92-4f52-87e2-cdb907eb9f9a-kube-api-access-kwz5m\") pod \"auto-csr-approver-29566760-pf429\" (UID: \"11faf586-5b92-4f52-87e2-cdb907eb9f9a\") " 
pod="openshift-infra/auto-csr-approver-29566760-pf429" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.474604 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-pf429" Mar 20 11:20:00 crc kubenswrapper[4748]: I0320 11:20:00.899868 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-pf429"] Mar 20 11:20:01 crc kubenswrapper[4748]: I0320 11:20:01.317051 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-pf429" event={"ID":"11faf586-5b92-4f52-87e2-cdb907eb9f9a","Type":"ContainerStarted","Data":"4e9df9c08d4778b57b1c14ffaa0b5be86e0af2b74db9513014f1905c52364f63"} Mar 20 11:20:04 crc kubenswrapper[4748]: I0320 11:20:04.351512 4748 generic.go:334] "Generic (PLEG): container finished" podID="11faf586-5b92-4f52-87e2-cdb907eb9f9a" containerID="de2f94c17e2ff5aba9bd32186bdcf407ba76a863c40c3ddb0cdb6363dc4edd9a" exitCode=0 Mar 20 11:20:04 crc kubenswrapper[4748]: I0320 11:20:04.351581 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-pf429" event={"ID":"11faf586-5b92-4f52-87e2-cdb907eb9f9a","Type":"ContainerDied","Data":"de2f94c17e2ff5aba9bd32186bdcf407ba76a863c40c3ddb0cdb6363dc4edd9a"} Mar 20 11:20:05 crc kubenswrapper[4748]: I0320 11:20:05.522760 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:20:05 crc kubenswrapper[4748]: E0320 11:20:05.523317 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" 
Mar 20 11:20:05 crc kubenswrapper[4748]: I0320 11:20:05.687346 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-pf429" Mar 20 11:20:05 crc kubenswrapper[4748]: I0320 11:20:05.771164 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwz5m\" (UniqueName: \"kubernetes.io/projected/11faf586-5b92-4f52-87e2-cdb907eb9f9a-kube-api-access-kwz5m\") pod \"11faf586-5b92-4f52-87e2-cdb907eb9f9a\" (UID: \"11faf586-5b92-4f52-87e2-cdb907eb9f9a\") " Mar 20 11:20:05 crc kubenswrapper[4748]: I0320 11:20:05.776371 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11faf586-5b92-4f52-87e2-cdb907eb9f9a-kube-api-access-kwz5m" (OuterVolumeSpecName: "kube-api-access-kwz5m") pod "11faf586-5b92-4f52-87e2-cdb907eb9f9a" (UID: "11faf586-5b92-4f52-87e2-cdb907eb9f9a"). InnerVolumeSpecName "kube-api-access-kwz5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:20:05 crc kubenswrapper[4748]: I0320 11:20:05.873719 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwz5m\" (UniqueName: \"kubernetes.io/projected/11faf586-5b92-4f52-87e2-cdb907eb9f9a-kube-api-access-kwz5m\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:06 crc kubenswrapper[4748]: I0320 11:20:06.371194 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-pf429" event={"ID":"11faf586-5b92-4f52-87e2-cdb907eb9f9a","Type":"ContainerDied","Data":"4e9df9c08d4778b57b1c14ffaa0b5be86e0af2b74db9513014f1905c52364f63"} Mar 20 11:20:06 crc kubenswrapper[4748]: I0320 11:20:06.371244 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e9df9c08d4778b57b1c14ffaa0b5be86e0af2b74db9513014f1905c52364f63" Mar 20 11:20:06 crc kubenswrapper[4748]: I0320 11:20:06.371305 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-pf429" Mar 20 11:20:06 crc kubenswrapper[4748]: I0320 11:20:06.761464 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-xn42x"] Mar 20 11:20:06 crc kubenswrapper[4748]: I0320 11:20:06.788351 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-xn42x"] Mar 20 11:20:07 crc kubenswrapper[4748]: I0320 11:20:07.526493 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f23fc0-639a-465e-b0d8-3ca44dc9d1ac" path="/var/lib/kubelet/pods/10f23fc0-639a-465e-b0d8-3ca44dc9d1ac/volumes" Mar 20 11:20:19 crc kubenswrapper[4748]: I0320 11:20:19.516096 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:20:19 crc kubenswrapper[4748]: E0320 11:20:19.516768 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:20:20 crc kubenswrapper[4748]: I0320 11:20:20.486034 4748 generic.go:334] "Generic (PLEG): container finished" podID="4457d05c-f317-4cf2-97cb-03888616f4af" containerID="b2eedaa0dc1cf26cccb7bf0c3a08e0b17456243e9d388ad1513526b968bec243" exitCode=0 Mar 20 11:20:20 crc kubenswrapper[4748]: I0320 11:20:20.486186 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" event={"ID":"4457d05c-f317-4cf2-97cb-03888616f4af","Type":"ContainerDied","Data":"b2eedaa0dc1cf26cccb7bf0c3a08e0b17456243e9d388ad1513526b968bec243"} Mar 20 11:20:20 crc kubenswrapper[4748]: I0320 11:20:20.945887 
4748 scope.go:117] "RemoveContainer" containerID="a5cf8131bef5a9d9f94387f4389c972e26d4cd4663a2f0c987dc97b377674c74" Mar 20 11:20:20 crc kubenswrapper[4748]: I0320 11:20:20.996621 4748 scope.go:117] "RemoveContainer" containerID="866f991745ee6a71588a0f2075f37b4a0d1e4c13c379bf12f4021a4c4d4c2efd" Mar 20 11:20:21 crc kubenswrapper[4748]: I0320 11:20:21.037639 4748 scope.go:117] "RemoveContainer" containerID="e6271ae8a250b7f6be1bf6f53f04d1f8525072b9bc58deb1a1a74e401f301fbb" Mar 20 11:20:21 crc kubenswrapper[4748]: I0320 11:20:21.076185 4748 scope.go:117] "RemoveContainer" containerID="f6e2e8457ea5cd4eed85ac6405c07b4f7d7f3e1e62940f66e030b3ca1ac2b060" Mar 20 11:20:21 crc kubenswrapper[4748]: I0320 11:20:21.936077 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.075520 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-1\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.075632 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-ssh-key-openstack-edpm-ipam\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.075690 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-inventory\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 
11:20:22.075748 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4457d05c-f317-4cf2-97cb-03888616f4af-nova-extra-config-0\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.075771 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-2\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.075812 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-0\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.075855 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-0\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.075902 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-3\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.075944 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-combined-ca-bundle\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.075998 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-1\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.076034 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78s4q\" (UniqueName: \"kubernetes.io/projected/4457d05c-f317-4cf2-97cb-03888616f4af-kube-api-access-78s4q\") pod \"4457d05c-f317-4cf2-97cb-03888616f4af\" (UID: \"4457d05c-f317-4cf2-97cb-03888616f4af\") " Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.120090 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4457d05c-f317-4cf2-97cb-03888616f4af-kube-api-access-78s4q" (OuterVolumeSpecName: "kube-api-access-78s4q") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "kube-api-access-78s4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.131323 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.157762 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.157891 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.158923 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.166978 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4457d05c-f317-4cf2-97cb-03888616f4af-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.175797 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.178161 4748 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4457d05c-f317-4cf2-97cb-03888616f4af-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.178197 4748 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.178208 4748 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.178221 4748 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.178231 4748 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc 
kubenswrapper[4748]: I0320 11:20:22.178241 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78s4q\" (UniqueName: \"kubernetes.io/projected/4457d05c-f317-4cf2-97cb-03888616f4af-kube-api-access-78s4q\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.178249 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.178557 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.179114 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-inventory" (OuterVolumeSpecName: "inventory") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.187978 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.189534 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4457d05c-f317-4cf2-97cb-03888616f4af" (UID: "4457d05c-f317-4cf2-97cb-03888616f4af"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.281031 4748 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.281264 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.281368 4748 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.281443 4748 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4457d05c-f317-4cf2-97cb-03888616f4af-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.503907 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" event={"ID":"4457d05c-f317-4cf2-97cb-03888616f4af","Type":"ContainerDied","Data":"c6bcdc0701b37078e394a09f310a52342ae751f79ed6c1aae2f7a0f63ee65c7f"} Mar 20 11:20:22 crc 
kubenswrapper[4748]: I0320 11:20:22.503958 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6bcdc0701b37078e394a09f310a52342ae751f79ed6c1aae2f7a0f63ee65c7f" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.504203 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fjmvn" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.632619 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b"] Mar 20 11:20:22 crc kubenswrapper[4748]: E0320 11:20:22.633076 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11faf586-5b92-4f52-87e2-cdb907eb9f9a" containerName="oc" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.633094 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="11faf586-5b92-4f52-87e2-cdb907eb9f9a" containerName="oc" Mar 20 11:20:22 crc kubenswrapper[4748]: E0320 11:20:22.633105 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457d05c-f317-4cf2-97cb-03888616f4af" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.633113 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457d05c-f317-4cf2-97cb-03888616f4af" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.633295 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="11faf586-5b92-4f52-87e2-cdb907eb9f9a" containerName="oc" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.633323 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457d05c-f317-4cf2-97cb-03888616f4af" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.633955 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.637171 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.637483 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.637673 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fd5jb" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.638347 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.638885 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.658560 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b"] Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.689472 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.689698 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56t4\" (UniqueName: \"kubernetes.io/projected/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-kube-api-access-m56t4\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.689854 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.689927 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.689968 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.690001 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.690062 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.791288 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m56t4\" (UniqueName: \"kubernetes.io/projected/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-kube-api-access-m56t4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.791368 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.791416 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.791438 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.791465 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.791511 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.791578 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.795624 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.797410 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.797554 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.809052 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.809076 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.809996 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.812046 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56t4\" (UniqueName: \"kubernetes.io/projected/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-kube-api-access-m56t4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:22 crc kubenswrapper[4748]: I0320 11:20:22.960051 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:20:23 crc kubenswrapper[4748]: I0320 11:20:23.496067 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b"] Mar 20 11:20:23 crc kubenswrapper[4748]: I0320 11:20:23.527879 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" event={"ID":"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb","Type":"ContainerStarted","Data":"b533e2ecfa644c491d116e0d47aea5ba526ac983855baa4e3a93b9fa6c96768d"} Mar 20 11:20:24 crc kubenswrapper[4748]: I0320 11:20:24.524700 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" event={"ID":"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb","Type":"ContainerStarted","Data":"6545c51563d5f190a056e7e616d76a0ea4e991b4c505b57d606cd776e0660f0b"} Mar 20 11:20:24 crc kubenswrapper[4748]: I0320 11:20:24.552170 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" podStartSLOduration=2.379458516 podStartE2EDuration="2.552147998s" podCreationTimestamp="2026-03-20 11:20:22 +0000 UTC" firstStartedPulling="2026-03-20 11:20:23.512067732 +0000 UTC m=+2658.653613546" lastFinishedPulling="2026-03-20 11:20:23.684757214 +0000 UTC m=+2658.826303028" observedRunningTime="2026-03-20 11:20:24.545601954 +0000 UTC m=+2659.687147768" watchObservedRunningTime="2026-03-20 11:20:24.552147998 +0000 UTC m=+2659.693693812" Mar 20 11:20:31 crc kubenswrapper[4748]: I0320 11:20:31.516238 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:20:31 crc kubenswrapper[4748]: E0320 11:20:31.517097 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:20:45 crc kubenswrapper[4748]: I0320 11:20:45.526434 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:20:45 crc kubenswrapper[4748]: E0320 11:20:45.527485 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:20:56 crc kubenswrapper[4748]: I0320 11:20:56.515290 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:20:56 crc kubenswrapper[4748]: E0320 11:20:56.516083 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:21:09 crc kubenswrapper[4748]: I0320 11:21:09.515336 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:21:09 crc kubenswrapper[4748]: E0320 11:21:09.516103 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:21:21 crc kubenswrapper[4748]: I0320 11:21:21.515714 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:21:21 crc kubenswrapper[4748]: E0320 11:21:21.516986 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:21:35 crc kubenswrapper[4748]: I0320 11:21:35.521967 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:21:35 crc kubenswrapper[4748]: E0320 11:21:35.522773 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:21:41 crc kubenswrapper[4748]: I0320 11:21:41.829637 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p79hj"] Mar 20 11:21:41 crc kubenswrapper[4748]: I0320 11:21:41.833652 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:41 crc kubenswrapper[4748]: I0320 11:21:41.859390 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p79hj"] Mar 20 11:21:41 crc kubenswrapper[4748]: I0320 11:21:41.934110 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxckj\" (UniqueName: \"kubernetes.io/projected/a0019456-624b-4ba1-b950-76813498c5f7-kube-api-access-gxckj\") pod \"redhat-operators-p79hj\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:41 crc kubenswrapper[4748]: I0320 11:21:41.934199 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-catalog-content\") pod \"redhat-operators-p79hj\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:41 crc kubenswrapper[4748]: I0320 11:21:41.934363 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-utilities\") pod \"redhat-operators-p79hj\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:42 crc kubenswrapper[4748]: I0320 11:21:42.036054 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-utilities\") pod \"redhat-operators-p79hj\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:42 crc kubenswrapper[4748]: I0320 11:21:42.036479 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gxckj\" (UniqueName: \"kubernetes.io/projected/a0019456-624b-4ba1-b950-76813498c5f7-kube-api-access-gxckj\") pod \"redhat-operators-p79hj\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:42 crc kubenswrapper[4748]: I0320 11:21:42.036541 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-catalog-content\") pod \"redhat-operators-p79hj\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:42 crc kubenswrapper[4748]: I0320 11:21:42.036544 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-utilities\") pod \"redhat-operators-p79hj\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:42 crc kubenswrapper[4748]: I0320 11:21:42.036933 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-catalog-content\") pod \"redhat-operators-p79hj\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:42 crc kubenswrapper[4748]: I0320 11:21:42.075433 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxckj\" (UniqueName: \"kubernetes.io/projected/a0019456-624b-4ba1-b950-76813498c5f7-kube-api-access-gxckj\") pod \"redhat-operators-p79hj\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:42 crc kubenswrapper[4748]: I0320 11:21:42.158383 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:21:42 crc kubenswrapper[4748]: I0320 11:21:42.661758 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p79hj"] Mar 20 11:21:42 crc kubenswrapper[4748]: W0320 11:21:42.670698 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice/crio-557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d WatchSource:0}: Error finding container 557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d: Status 404 returned error can't find the container with id 557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d Mar 20 11:21:43 crc kubenswrapper[4748]: I0320 11:21:43.478133 4748 generic.go:334] "Generic (PLEG): container finished" podID="a0019456-624b-4ba1-b950-76813498c5f7" containerID="c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5" exitCode=0 Mar 20 11:21:43 crc kubenswrapper[4748]: I0320 11:21:43.478198 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p79hj" event={"ID":"a0019456-624b-4ba1-b950-76813498c5f7","Type":"ContainerDied","Data":"c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5"} Mar 20 11:21:43 crc kubenswrapper[4748]: I0320 11:21:43.479944 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p79hj" event={"ID":"a0019456-624b-4ba1-b950-76813498c5f7","Type":"ContainerStarted","Data":"557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d"} Mar 20 11:21:43 crc kubenswrapper[4748]: I0320 11:21:43.479777 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:21:46 crc kubenswrapper[4748]: I0320 11:21:46.502868 4748 generic.go:334] "Generic (PLEG): container finished" 
podID="a0019456-624b-4ba1-b950-76813498c5f7" containerID="523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670" exitCode=0 Mar 20 11:21:46 crc kubenswrapper[4748]: I0320 11:21:46.502928 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p79hj" event={"ID":"a0019456-624b-4ba1-b950-76813498c5f7","Type":"ContainerDied","Data":"523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670"} Mar 20 11:21:47 crc kubenswrapper[4748]: I0320 11:21:47.515747 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:21:47 crc kubenswrapper[4748]: E0320 11:21:47.516293 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:21:52 crc kubenswrapper[4748]: I0320 11:21:52.555579 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p79hj" event={"ID":"a0019456-624b-4ba1-b950-76813498c5f7","Type":"ContainerStarted","Data":"4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b"} Mar 20 11:21:52 crc kubenswrapper[4748]: I0320 11:21:52.580812 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p79hj" podStartSLOduration=3.094914226 podStartE2EDuration="11.580788062s" podCreationTimestamp="2026-03-20 11:21:41 +0000 UTC" firstStartedPulling="2026-03-20 11:21:43.479561152 +0000 UTC m=+2738.621106966" lastFinishedPulling="2026-03-20 11:21:51.965434988 +0000 UTC m=+2747.106980802" observedRunningTime="2026-03-20 11:21:52.572421383 +0000 UTC m=+2747.713967207" 
watchObservedRunningTime="2026-03-20 11:21:52.580788062 +0000 UTC m=+2747.722333876" Mar 20 11:21:58 crc kubenswrapper[4748]: I0320 11:21:58.515756 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:21:58 crc kubenswrapper[4748]: E0320 11:21:58.516684 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.149246 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566762-dwf2g"] Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.151133 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-dwf2g" Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.154636 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.154723 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.155281 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.157434 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-dwf2g"] Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.190189 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf99q\" (UniqueName: \"kubernetes.io/projected/a8b76d0b-b803-4b0f-8f32-b3316847237d-kube-api-access-vf99q\") pod \"auto-csr-approver-29566762-dwf2g\" (UID: \"a8b76d0b-b803-4b0f-8f32-b3316847237d\") " pod="openshift-infra/auto-csr-approver-29566762-dwf2g" Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.291181 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf99q\" (UniqueName: \"kubernetes.io/projected/a8b76d0b-b803-4b0f-8f32-b3316847237d-kube-api-access-vf99q\") pod \"auto-csr-approver-29566762-dwf2g\" (UID: \"a8b76d0b-b803-4b0f-8f32-b3316847237d\") " pod="openshift-infra/auto-csr-approver-29566762-dwf2g" Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.311787 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf99q\" (UniqueName: \"kubernetes.io/projected/a8b76d0b-b803-4b0f-8f32-b3316847237d-kube-api-access-vf99q\") pod \"auto-csr-approver-29566762-dwf2g\" (UID: \"a8b76d0b-b803-4b0f-8f32-b3316847237d\") " 
pod="openshift-infra/auto-csr-approver-29566762-dwf2g" Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.469849 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-dwf2g" Mar 20 11:22:00 crc kubenswrapper[4748]: I0320 11:22:00.929778 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-dwf2g"] Mar 20 11:22:01 crc kubenswrapper[4748]: I0320 11:22:01.630127 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-dwf2g" event={"ID":"a8b76d0b-b803-4b0f-8f32-b3316847237d","Type":"ContainerStarted","Data":"8073ed8baa2d67665e0edaad12dbaf285bae0f807c92a8098a5bd27d2c2312b8"} Mar 20 11:22:02 crc kubenswrapper[4748]: I0320 11:22:02.159522 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:22:02 crc kubenswrapper[4748]: I0320 11:22:02.159590 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:22:02 crc kubenswrapper[4748]: I0320 11:22:02.206630 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:22:02 crc kubenswrapper[4748]: I0320 11:22:02.687537 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:22:02 crc kubenswrapper[4748]: I0320 11:22:02.737874 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p79hj"] Mar 20 11:22:03 crc kubenswrapper[4748]: I0320 11:22:03.648116 4748 generic.go:334] "Generic (PLEG): container finished" podID="a8b76d0b-b803-4b0f-8f32-b3316847237d" containerID="a05a62a5662e00d3059c7f59b6c6513779165a5e04b68bd60b4e14459c60bb5d" exitCode=0 Mar 20 11:22:03 crc kubenswrapper[4748]: I0320 11:22:03.648340 4748 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-dwf2g" event={"ID":"a8b76d0b-b803-4b0f-8f32-b3316847237d","Type":"ContainerDied","Data":"a05a62a5662e00d3059c7f59b6c6513779165a5e04b68bd60b4e14459c60bb5d"} Mar 20 11:22:04 crc kubenswrapper[4748]: I0320 11:22:04.657104 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p79hj" podUID="a0019456-624b-4ba1-b950-76813498c5f7" containerName="registry-server" containerID="cri-o://4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b" gracePeriod=2 Mar 20 11:22:04 crc kubenswrapper[4748]: I0320 11:22:04.997610 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-dwf2g" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.133736 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.191537 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf99q\" (UniqueName: \"kubernetes.io/projected/a8b76d0b-b803-4b0f-8f32-b3316847237d-kube-api-access-vf99q\") pod \"a8b76d0b-b803-4b0f-8f32-b3316847237d\" (UID: \"a8b76d0b-b803-4b0f-8f32-b3316847237d\") " Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.197187 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8b76d0b-b803-4b0f-8f32-b3316847237d-kube-api-access-vf99q" (OuterVolumeSpecName: "kube-api-access-vf99q") pod "a8b76d0b-b803-4b0f-8f32-b3316847237d" (UID: "a8b76d0b-b803-4b0f-8f32-b3316847237d"). InnerVolumeSpecName "kube-api-access-vf99q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.293209 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-utilities\") pod \"a0019456-624b-4ba1-b950-76813498c5f7\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.293275 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxckj\" (UniqueName: \"kubernetes.io/projected/a0019456-624b-4ba1-b950-76813498c5f7-kube-api-access-gxckj\") pod \"a0019456-624b-4ba1-b950-76813498c5f7\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.293455 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-catalog-content\") pod \"a0019456-624b-4ba1-b950-76813498c5f7\" (UID: \"a0019456-624b-4ba1-b950-76813498c5f7\") " Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.293996 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf99q\" (UniqueName: \"kubernetes.io/projected/a8b76d0b-b803-4b0f-8f32-b3316847237d-kube-api-access-vf99q\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.294676 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-utilities" (OuterVolumeSpecName: "utilities") pod "a0019456-624b-4ba1-b950-76813498c5f7" (UID: "a0019456-624b-4ba1-b950-76813498c5f7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.297436 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0019456-624b-4ba1-b950-76813498c5f7-kube-api-access-gxckj" (OuterVolumeSpecName: "kube-api-access-gxckj") pod "a0019456-624b-4ba1-b950-76813498c5f7" (UID: "a0019456-624b-4ba1-b950-76813498c5f7"). InnerVolumeSpecName "kube-api-access-gxckj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.395437 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.395483 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxckj\" (UniqueName: \"kubernetes.io/projected/a0019456-624b-4ba1-b950-76813498c5f7-kube-api-access-gxckj\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.425206 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0019456-624b-4ba1-b950-76813498c5f7" (UID: "a0019456-624b-4ba1-b950-76813498c5f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.497795 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0019456-624b-4ba1-b950-76813498c5f7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.668692 4748 generic.go:334] "Generic (PLEG): container finished" podID="a0019456-624b-4ba1-b950-76813498c5f7" containerID="4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b" exitCode=0 Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.668735 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p79hj" event={"ID":"a0019456-624b-4ba1-b950-76813498c5f7","Type":"ContainerDied","Data":"4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b"} Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.668806 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p79hj" event={"ID":"a0019456-624b-4ba1-b950-76813498c5f7","Type":"ContainerDied","Data":"557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d"} Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.668808 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p79hj" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.668845 4748 scope.go:117] "RemoveContainer" containerID="4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.672955 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-dwf2g" event={"ID":"a8b76d0b-b803-4b0f-8f32-b3316847237d","Type":"ContainerDied","Data":"8073ed8baa2d67665e0edaad12dbaf285bae0f807c92a8098a5bd27d2c2312b8"} Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.672999 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-dwf2g" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.673001 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8073ed8baa2d67665e0edaad12dbaf285bae0f807c92a8098a5bd27d2c2312b8" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.701564 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p79hj"] Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.711373 4748 scope.go:117] "RemoveContainer" containerID="523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.712948 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p79hj"] Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.735975 4748 scope.go:117] "RemoveContainer" containerID="c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.753504 4748 scope.go:117] "RemoveContainer" containerID="4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b" Mar 20 11:22:05 crc kubenswrapper[4748]: E0320 11:22:05.754485 4748 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b\": container with ID starting with 4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b not found: ID does not exist" containerID="4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.754518 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b"} err="failed to get container status \"4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b\": rpc error: code = NotFound desc = could not find container \"4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b\": container with ID starting with 4cb903a9f822c080926c0ab5af7a48f1f229114ca56d028aabfa0e1c3399ca2b not found: ID does not exist" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.754539 4748 scope.go:117] "RemoveContainer" containerID="523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670" Mar 20 11:22:05 crc kubenswrapper[4748]: E0320 11:22:05.755066 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670\": container with ID starting with 523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670 not found: ID does not exist" containerID="523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.755140 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670"} err="failed to get container status \"523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670\": rpc error: code = NotFound desc = could not find container 
\"523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670\": container with ID starting with 523217349a35e5f72bfaadf4b65820b7b377f0554a08c40bca39d1b386f7c670 not found: ID does not exist" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.755194 4748 scope.go:117] "RemoveContainer" containerID="c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5" Mar 20 11:22:05 crc kubenswrapper[4748]: E0320 11:22:05.756885 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5\": container with ID starting with c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5 not found: ID does not exist" containerID="c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5" Mar 20 11:22:05 crc kubenswrapper[4748]: I0320 11:22:05.756948 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5"} err="failed to get container status \"c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5\": rpc error: code = NotFound desc = could not find container \"c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5\": container with ID starting with c55554a272c5ede3644a915fde1b12d559a44131c45f15f8c39c14ce4c7d37d5 not found: ID does not exist" Mar 20 11:22:06 crc kubenswrapper[4748]: I0320 11:22:06.066039 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-s7r5n"] Mar 20 11:22:06 crc kubenswrapper[4748]: I0320 11:22:06.077602 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-s7r5n"] Mar 20 11:22:07 crc kubenswrapper[4748]: I0320 11:22:07.525639 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="193a6d7a-46ae-4834-a29d-b539ce90fc65" 
path="/var/lib/kubelet/pods/193a6d7a-46ae-4834-a29d-b539ce90fc65/volumes" Mar 20 11:22:07 crc kubenswrapper[4748]: I0320 11:22:07.526369 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0019456-624b-4ba1-b950-76813498c5f7" path="/var/lib/kubelet/pods/a0019456-624b-4ba1-b950-76813498c5f7/volumes" Mar 20 11:22:11 crc kubenswrapper[4748]: I0320 11:22:11.515925 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:22:11 crc kubenswrapper[4748]: E0320 11:22:11.516763 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:22:12 crc kubenswrapper[4748]: E0320 11:22:12.669978 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice/crio-8073ed8baa2d67665e0edaad12dbaf285bae0f807c92a8098a5bd27d2c2312b8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice/crio-557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:22:21 crc 
kubenswrapper[4748]: I0320 11:22:21.183092 4748 scope.go:117] "RemoveContainer" containerID="27e518e54fdd74a7549c9ad651969f888e457063ac2fc259671305be23e1e38b" Mar 20 11:22:22 crc kubenswrapper[4748]: I0320 11:22:22.515647 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:22:22 crc kubenswrapper[4748]: E0320 11:22:22.516057 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:22:22 crc kubenswrapper[4748]: E0320 11:22:22.923845 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice/crio-557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice/crio-8073ed8baa2d67665e0edaad12dbaf285bae0f807c92a8098a5bd27d2c2312b8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:22:33 crc kubenswrapper[4748]: E0320 11:22:33.150064 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice/crio-557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice/crio-8073ed8baa2d67665e0edaad12dbaf285bae0f807c92a8098a5bd27d2c2312b8\": RecentStats: unable to find data in memory cache]" Mar 20 11:22:37 crc kubenswrapper[4748]: I0320 11:22:37.515128 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:22:37 crc kubenswrapper[4748]: E0320 11:22:37.516887 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:22:43 crc kubenswrapper[4748]: E0320 11:22:43.391089 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice/crio-8073ed8baa2d67665e0edaad12dbaf285bae0f807c92a8098a5bd27d2c2312b8\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice/crio-557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d\": RecentStats: unable to find data in memory cache]" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.115964 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gmcsp"] Mar 20 11:22:49 crc kubenswrapper[4748]: E0320 11:22:49.116964 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b76d0b-b803-4b0f-8f32-b3316847237d" containerName="oc" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.116982 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b76d0b-b803-4b0f-8f32-b3316847237d" containerName="oc" Mar 20 11:22:49 crc kubenswrapper[4748]: E0320 11:22:49.116997 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0019456-624b-4ba1-b950-76813498c5f7" containerName="extract-content" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.117006 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0019456-624b-4ba1-b950-76813498c5f7" containerName="extract-content" Mar 20 11:22:49 crc kubenswrapper[4748]: E0320 11:22:49.117029 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0019456-624b-4ba1-b950-76813498c5f7" containerName="extract-utilities" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.117037 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0019456-624b-4ba1-b950-76813498c5f7" containerName="extract-utilities" Mar 20 11:22:49 crc kubenswrapper[4748]: E0320 11:22:49.117062 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0019456-624b-4ba1-b950-76813498c5f7" containerName="registry-server" Mar 20 
11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.117069 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0019456-624b-4ba1-b950-76813498c5f7" containerName="registry-server" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.117267 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8b76d0b-b803-4b0f-8f32-b3316847237d" containerName="oc" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.117292 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0019456-624b-4ba1-b950-76813498c5f7" containerName="registry-server" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.118817 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.128172 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmcsp"] Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.205738 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-catalog-content\") pod \"redhat-marketplace-gmcsp\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.205809 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqc5h\" (UniqueName: \"kubernetes.io/projected/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-kube-api-access-jqc5h\") pod \"redhat-marketplace-gmcsp\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.206237 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-utilities\") pod \"redhat-marketplace-gmcsp\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.308488 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-catalog-content\") pod \"redhat-marketplace-gmcsp\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.308559 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqc5h\" (UniqueName: \"kubernetes.io/projected/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-kube-api-access-jqc5h\") pod \"redhat-marketplace-gmcsp\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.308719 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-utilities\") pod \"redhat-marketplace-gmcsp\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.309031 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-catalog-content\") pod \"redhat-marketplace-gmcsp\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.309333 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-utilities\") pod \"redhat-marketplace-gmcsp\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.329819 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqc5h\" (UniqueName: \"kubernetes.io/projected/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-kube-api-access-jqc5h\") pod \"redhat-marketplace-gmcsp\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.440444 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:49 crc kubenswrapper[4748]: I0320 11:22:49.912747 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmcsp"] Mar 20 11:22:50 crc kubenswrapper[4748]: I0320 11:22:50.037629 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmcsp" event={"ID":"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09","Type":"ContainerStarted","Data":"ff8279830164b212e3d6cc2731355f5ed1450b300c2d53de3f41b480c01f1671"} Mar 20 11:22:50 crc kubenswrapper[4748]: I0320 11:22:50.515990 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:22:51 crc kubenswrapper[4748]: I0320 11:22:51.050704 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"a585b460ade4b7044a27fe4f08a1d29b65f1fac9c0d4f42f3e310bffe005c7ed"} Mar 20 11:22:51 crc kubenswrapper[4748]: I0320 11:22:51.053959 4748 generic.go:334] "Generic (PLEG): container finished" podID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" 
containerID="525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d" exitCode=0 Mar 20 11:22:51 crc kubenswrapper[4748]: I0320 11:22:51.054008 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmcsp" event={"ID":"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09","Type":"ContainerDied","Data":"525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d"} Mar 20 11:22:53 crc kubenswrapper[4748]: I0320 11:22:53.074715 4748 generic.go:334] "Generic (PLEG): container finished" podID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" containerID="fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693" exitCode=0 Mar 20 11:22:53 crc kubenswrapper[4748]: I0320 11:22:53.074848 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmcsp" event={"ID":"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09","Type":"ContainerDied","Data":"fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693"} Mar 20 11:22:53 crc kubenswrapper[4748]: E0320 11:22:53.661990 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice/crio-8073ed8baa2d67665e0edaad12dbaf285bae0f807c92a8098a5bd27d2c2312b8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice/crio-557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d\": RecentStats: unable to find data in memory cache]" Mar 20 11:22:54 crc kubenswrapper[4748]: 
I0320 11:22:54.083756 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmcsp" event={"ID":"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09","Type":"ContainerStarted","Data":"4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b"} Mar 20 11:22:54 crc kubenswrapper[4748]: I0320 11:22:54.108059 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gmcsp" podStartSLOduration=2.406362149 podStartE2EDuration="5.108030231s" podCreationTimestamp="2026-03-20 11:22:49 +0000 UTC" firstStartedPulling="2026-03-20 11:22:51.0559858 +0000 UTC m=+2806.197531614" lastFinishedPulling="2026-03-20 11:22:53.757653882 +0000 UTC m=+2808.899199696" observedRunningTime="2026-03-20 11:22:54.103137648 +0000 UTC m=+2809.244683482" watchObservedRunningTime="2026-03-20 11:22:54.108030231 +0000 UTC m=+2809.249576055" Mar 20 11:22:59 crc kubenswrapper[4748]: I0320 11:22:59.441334 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:59 crc kubenswrapper[4748]: I0320 11:22:59.441881 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:22:59 crc kubenswrapper[4748]: I0320 11:22:59.496857 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:23:00 crc kubenswrapper[4748]: I0320 11:23:00.184702 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:23:00 crc kubenswrapper[4748]: I0320 11:23:00.237422 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmcsp"] Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.153341 4748 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-gmcsp" podUID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" containerName="registry-server" containerID="cri-o://4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b" gracePeriod=2 Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.592055 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.769386 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqc5h\" (UniqueName: \"kubernetes.io/projected/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-kube-api-access-jqc5h\") pod \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.769485 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-utilities\") pod \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.769611 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-catalog-content\") pod \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\" (UID: \"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09\") " Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.770347 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-utilities" (OuterVolumeSpecName: "utilities") pod "6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" (UID: "6e831d69-b4bf-49ed-b0f8-9b73b8f14b09"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.775529 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-kube-api-access-jqc5h" (OuterVolumeSpecName: "kube-api-access-jqc5h") pod "6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" (UID: "6e831d69-b4bf-49ed-b0f8-9b73b8f14b09"). InnerVolumeSpecName "kube-api-access-jqc5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.801010 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" (UID: "6e831d69-b4bf-49ed-b0f8-9b73b8f14b09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.871864 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.871904 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqc5h\" (UniqueName: \"kubernetes.io/projected/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-kube-api-access-jqc5h\") on node \"crc\" DevicePath \"\"" Mar 20 11:23:02 crc kubenswrapper[4748]: I0320 11:23:02.871917 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.162869 4748 generic.go:334] "Generic (PLEG): container finished" podID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" 
containerID="4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b" exitCode=0 Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.162918 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmcsp" event={"ID":"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09","Type":"ContainerDied","Data":"4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b"} Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.162947 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmcsp" event={"ID":"6e831d69-b4bf-49ed-b0f8-9b73b8f14b09","Type":"ContainerDied","Data":"ff8279830164b212e3d6cc2731355f5ed1450b300c2d53de3f41b480c01f1671"} Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.162962 4748 scope.go:117] "RemoveContainer" containerID="4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.163082 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmcsp" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.196274 4748 scope.go:117] "RemoveContainer" containerID="fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.203662 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmcsp"] Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.218779 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmcsp"] Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.222661 4748 scope.go:117] "RemoveContainer" containerID="525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.261593 4748 scope.go:117] "RemoveContainer" containerID="4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b" Mar 20 11:23:03 crc kubenswrapper[4748]: E0320 11:23:03.262511 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b\": container with ID starting with 4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b not found: ID does not exist" containerID="4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.262556 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b"} err="failed to get container status \"4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b\": rpc error: code = NotFound desc = could not find container \"4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b\": container with ID starting with 4793640f2235c15c909c29dc14c63ba45956d7c198e41a0b1fbb54d1ef32fe5b not found: 
ID does not exist" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.262586 4748 scope.go:117] "RemoveContainer" containerID="fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693" Mar 20 11:23:03 crc kubenswrapper[4748]: E0320 11:23:03.263117 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693\": container with ID starting with fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693 not found: ID does not exist" containerID="fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.263159 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693"} err="failed to get container status \"fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693\": rpc error: code = NotFound desc = could not find container \"fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693\": container with ID starting with fc84a698f4f6ae3f3877aad485a88d74ff00ffdc8a93591076fa55a959206693 not found: ID does not exist" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.263206 4748 scope.go:117] "RemoveContainer" containerID="525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d" Mar 20 11:23:03 crc kubenswrapper[4748]: E0320 11:23:03.263537 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d\": container with ID starting with 525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d not found: ID does not exist" containerID="525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.263564 4748 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d"} err="failed to get container status \"525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d\": rpc error: code = NotFound desc = could not find container \"525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d\": container with ID starting with 525e5535dd9480968e8e3fd5d21424a94b3eb15c7b875ea58c458c6cd3507d8d not found: ID does not exist" Mar 20 11:23:03 crc kubenswrapper[4748]: I0320 11:23:03.525844 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" path="/var/lib/kubelet/pods/6e831d69-b4bf-49ed-b0f8-9b73b8f14b09/volumes" Mar 20 11:23:03 crc kubenswrapper[4748]: E0320 11:23:03.886255 4748 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice/crio-557f61fd8c6df98c9e8265ede0e01edff2ee49aff33b35d0c62e86ff5fc7d53d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice/crio-8073ed8baa2d67665e0edaad12dbaf285bae0f807c92a8098a5bd27d2c2312b8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b76d0b_b803_4b0f_8f32_b3316847237d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0019456_624b_4ba1_b950_76813498c5f7.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:23:34 crc kubenswrapper[4748]: I0320 11:23:34.424128 4748 generic.go:334] "Generic (PLEG): container finished" podID="d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" containerID="6545c51563d5f190a056e7e616d76a0ea4e991b4c505b57d606cd776e0660f0b" exitCode=0 Mar 20 11:23:34 crc 
kubenswrapper[4748]: I0320 11:23:34.424202 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" event={"ID":"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb","Type":"ContainerDied","Data":"6545c51563d5f190a056e7e616d76a0ea4e991b4c505b57d606cd776e0660f0b"} Mar 20 11:23:35 crc kubenswrapper[4748]: I0320 11:23:35.852870 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:23:35 crc kubenswrapper[4748]: I0320 11:23:35.977159 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-inventory\") pod \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " Mar 20 11:23:35 crc kubenswrapper[4748]: I0320 11:23:35.977522 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-2\") pod \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " Mar 20 11:23:35 crc kubenswrapper[4748]: I0320 11:23:35.977563 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ssh-key-openstack-edpm-ipam\") pod \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " Mar 20 11:23:35 crc kubenswrapper[4748]: I0320 11:23:35.977596 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-1\") pod \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\" (UID: 
\"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " Mar 20 11:23:35 crc kubenswrapper[4748]: I0320 11:23:35.977718 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m56t4\" (UniqueName: \"kubernetes.io/projected/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-kube-api-access-m56t4\") pod \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " Mar 20 11:23:35 crc kubenswrapper[4748]: I0320 11:23:35.977810 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-0\") pod \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " Mar 20 11:23:35 crc kubenswrapper[4748]: I0320 11:23:35.977873 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-telemetry-combined-ca-bundle\") pod \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\" (UID: \"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb\") " Mar 20 11:23:35 crc kubenswrapper[4748]: I0320 11:23:35.982916 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-kube-api-access-m56t4" (OuterVolumeSpecName: "kube-api-access-m56t4") pod "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" (UID: "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb"). InnerVolumeSpecName "kube-api-access-m56t4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:23:35 crc kubenswrapper[4748]: I0320 11:23:35.996418 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" (UID: "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.008515 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" (UID: "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.008695 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" (UID: "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.010167 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" (UID: "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.012123 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" (UID: "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.014210 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-inventory" (OuterVolumeSpecName: "inventory") pod "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" (UID: "d03d3cf8-b0f5-46bf-9396-e6da7698e6fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.079988 4748 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.080032 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.080047 4748 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.080065 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m56t4\" 
(UniqueName: \"kubernetes.io/projected/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-kube-api-access-m56t4\") on node \"crc\" DevicePath \"\"" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.080081 4748 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.080093 4748 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.080108 4748 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d03d3cf8-b0f5-46bf-9396-e6da7698e6fb-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.465349 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" event={"ID":"d03d3cf8-b0f5-46bf-9396-e6da7698e6fb","Type":"ContainerDied","Data":"b533e2ecfa644c491d116e0d47aea5ba526ac983855baa4e3a93b9fa6c96768d"} Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.465445 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b533e2ecfa644c491d116e0d47aea5ba526ac983855baa4e3a93b9fa6c96768d" Mar 20 11:23:36 crc kubenswrapper[4748]: I0320 11:23:36.473934 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.141099 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566764-wfwcl"] Mar 20 11:24:00 crc kubenswrapper[4748]: E0320 11:24:00.141944 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" containerName="extract-content" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.141956 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" containerName="extract-content" Mar 20 11:24:00 crc kubenswrapper[4748]: E0320 11:24:00.141971 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" containerName="extract-utilities" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.141977 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" containerName="extract-utilities" Mar 20 11:24:00 crc kubenswrapper[4748]: E0320 11:24:00.142003 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.142010 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 11:24:00 crc kubenswrapper[4748]: E0320 11:24:00.142022 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.142027 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.142201 4748 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d03d3cf8-b0f5-46bf-9396-e6da7698e6fb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.142211 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e831d69-b4bf-49ed-b0f8-9b73b8f14b09" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.142808 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-wfwcl" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.146168 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.146212 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.146623 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.155981 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-wfwcl"] Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.243471 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbmlh\" (UniqueName: \"kubernetes.io/projected/8fd94020-5267-4d83-94e4-4cd358e1f134-kube-api-access-bbmlh\") pod \"auto-csr-approver-29566764-wfwcl\" (UID: \"8fd94020-5267-4d83-94e4-4cd358e1f134\") " pod="openshift-infra/auto-csr-approver-29566764-wfwcl" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.346199 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbmlh\" (UniqueName: \"kubernetes.io/projected/8fd94020-5267-4d83-94e4-4cd358e1f134-kube-api-access-bbmlh\") pod 
\"auto-csr-approver-29566764-wfwcl\" (UID: \"8fd94020-5267-4d83-94e4-4cd358e1f134\") " pod="openshift-infra/auto-csr-approver-29566764-wfwcl" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.364440 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbmlh\" (UniqueName: \"kubernetes.io/projected/8fd94020-5267-4d83-94e4-4cd358e1f134-kube-api-access-bbmlh\") pod \"auto-csr-approver-29566764-wfwcl\" (UID: \"8fd94020-5267-4d83-94e4-4cd358e1f134\") " pod="openshift-infra/auto-csr-approver-29566764-wfwcl" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.468692 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-wfwcl" Mar 20 11:24:00 crc kubenswrapper[4748]: I0320 11:24:00.920397 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-wfwcl"] Mar 20 11:24:01 crc kubenswrapper[4748]: I0320 11:24:01.675557 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-wfwcl" event={"ID":"8fd94020-5267-4d83-94e4-4cd358e1f134","Type":"ContainerStarted","Data":"d0d3c7e5ee205ea55a466dd6a0422c05b26835788ea8e419f28e1b55e88706e6"} Mar 20 11:24:03 crc kubenswrapper[4748]: I0320 11:24:03.691994 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-wfwcl" event={"ID":"8fd94020-5267-4d83-94e4-4cd358e1f134","Type":"ContainerStarted","Data":"39e42753a07aa0fb0b7dec08330ef381824d586618f5f35d52c00091f0aff57d"} Mar 20 11:24:03 crc kubenswrapper[4748]: I0320 11:24:03.723763 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566764-wfwcl" podStartSLOduration=1.297719078 podStartE2EDuration="3.723736925s" podCreationTimestamp="2026-03-20 11:24:00 +0000 UTC" firstStartedPulling="2026-03-20 11:24:00.92425418 +0000 UTC m=+2876.065799994" lastFinishedPulling="2026-03-20 
11:24:03.350272027 +0000 UTC m=+2878.491817841" observedRunningTime="2026-03-20 11:24:03.707743153 +0000 UTC m=+2878.849288967" watchObservedRunningTime="2026-03-20 11:24:03.723736925 +0000 UTC m=+2878.865282749" Mar 20 11:24:04 crc kubenswrapper[4748]: I0320 11:24:04.701141 4748 generic.go:334] "Generic (PLEG): container finished" podID="8fd94020-5267-4d83-94e4-4cd358e1f134" containerID="39e42753a07aa0fb0b7dec08330ef381824d586618f5f35d52c00091f0aff57d" exitCode=0 Mar 20 11:24:04 crc kubenswrapper[4748]: I0320 11:24:04.701337 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-wfwcl" event={"ID":"8fd94020-5267-4d83-94e4-4cd358e1f134","Type":"ContainerDied","Data":"39e42753a07aa0fb0b7dec08330ef381824d586618f5f35d52c00091f0aff57d"} Mar 20 11:24:06 crc kubenswrapper[4748]: I0320 11:24:06.049559 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-wfwcl" Mar 20 11:24:06 crc kubenswrapper[4748]: I0320 11:24:06.175165 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbmlh\" (UniqueName: \"kubernetes.io/projected/8fd94020-5267-4d83-94e4-4cd358e1f134-kube-api-access-bbmlh\") pod \"8fd94020-5267-4d83-94e4-4cd358e1f134\" (UID: \"8fd94020-5267-4d83-94e4-4cd358e1f134\") " Mar 20 11:24:06 crc kubenswrapper[4748]: I0320 11:24:06.191212 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd94020-5267-4d83-94e4-4cd358e1f134-kube-api-access-bbmlh" (OuterVolumeSpecName: "kube-api-access-bbmlh") pod "8fd94020-5267-4d83-94e4-4cd358e1f134" (UID: "8fd94020-5267-4d83-94e4-4cd358e1f134"). InnerVolumeSpecName "kube-api-access-bbmlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:24:06 crc kubenswrapper[4748]: I0320 11:24:06.277618 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbmlh\" (UniqueName: \"kubernetes.io/projected/8fd94020-5267-4d83-94e4-4cd358e1f134-kube-api-access-bbmlh\") on node \"crc\" DevicePath \"\"" Mar 20 11:24:06 crc kubenswrapper[4748]: I0320 11:24:06.718396 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-wfwcl" event={"ID":"8fd94020-5267-4d83-94e4-4cd358e1f134","Type":"ContainerDied","Data":"d0d3c7e5ee205ea55a466dd6a0422c05b26835788ea8e419f28e1b55e88706e6"} Mar 20 11:24:06 crc kubenswrapper[4748]: I0320 11:24:06.718431 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-wfwcl" Mar 20 11:24:06 crc kubenswrapper[4748]: I0320 11:24:06.718437 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0d3c7e5ee205ea55a466dd6a0422c05b26835788ea8e419f28e1b55e88706e6" Mar 20 11:24:06 crc kubenswrapper[4748]: I0320 11:24:06.781116 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-pwps5"] Mar 20 11:24:06 crc kubenswrapper[4748]: I0320 11:24:06.790145 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-pwps5"] Mar 20 11:24:07 crc kubenswrapper[4748]: I0320 11:24:07.525762 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d203f64a-081c-4375-99d8-3eba99e8f895" path="/var/lib/kubelet/pods/d203f64a-081c-4375-99d8-3eba99e8f895/volumes" Mar 20 11:24:21 crc kubenswrapper[4748]: I0320 11:24:21.320331 4748 scope.go:117] "RemoveContainer" containerID="d3aa0a4d0156df77f2c3ca4480f412de71c3687ec773f7f50ce6f3021f63fbaa" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.186545 4748 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/tempest-tests-tempest"] Mar 20 11:24:30 crc kubenswrapper[4748]: E0320 11:24:30.187895 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd94020-5267-4d83-94e4-4cd358e1f134" containerName="oc" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.187938 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd94020-5267-4d83-94e4-4cd358e1f134" containerName="oc" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.188503 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd94020-5267-4d83-94e4-4cd358e1f134" containerName="oc" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.189524 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.198260 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fmdpt" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.198861 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.199190 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.199512 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.213626 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.335690 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " 
pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.335801 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.335889 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stbvw\" (UniqueName: \"kubernetes.io/projected/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-kube-api-access-stbvw\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.336005 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.336284 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.337052 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-config-data\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " 
pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.337158 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.337264 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.337349 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.439362 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.439691 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " 
pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.439816 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.439948 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.440104 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.440282 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stbvw\" (UniqueName: \"kubernetes.io/projected/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-kube-api-access-stbvw\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.440400 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.440555 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.440660 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-config-data\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.440807 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.441173 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.441266 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.441584 4748 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.442345 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-config-data\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.448868 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.451229 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.451848 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.465716 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stbvw\" (UniqueName: \"kubernetes.io/projected/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-kube-api-access-stbvw\") pod 
\"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.481420 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.525965 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 11:24:30 crc kubenswrapper[4748]: I0320 11:24:30.962918 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 11:24:31 crc kubenswrapper[4748]: I0320 11:24:31.972439 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"da9b4776-4f59-46e4-9cdf-953b0a7f83bf","Type":"ContainerStarted","Data":"4ddb9546ce0d27085ea90fc667c1d432c125b2bbdb76fa2736308e846999e0ac"} Mar 20 11:25:02 crc kubenswrapper[4748]: E0320 11:25:02.745823 4748 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 20 11:25:02 crc kubenswrapper[4748]: E0320 11:25:02.746469 4748 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-stbvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(da9b4776-4f59-46e4-9cdf-953b0a7f83bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:25:02 crc kubenswrapper[4748]: E0320 11:25:02.747682 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="da9b4776-4f59-46e4-9cdf-953b0a7f83bf" Mar 20 11:25:03 crc kubenswrapper[4748]: E0320 11:25:03.477485 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="da9b4776-4f59-46e4-9cdf-953b0a7f83bf" Mar 20 11:25:12 crc 
kubenswrapper[4748]: I0320 11:25:12.928989 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:25:12 crc kubenswrapper[4748]: I0320 11:25:12.929554 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:25:15 crc kubenswrapper[4748]: I0320 11:25:15.985499 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 11:25:17 crc kubenswrapper[4748]: I0320 11:25:17.598360 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"da9b4776-4f59-46e4-9cdf-953b0a7f83bf","Type":"ContainerStarted","Data":"6264d77b4f783d8c446f419c3e26316d6922a8e2bba2ed44c27ecc804310b54a"} Mar 20 11:25:17 crc kubenswrapper[4748]: I0320 11:25:17.625337 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.611298841 podStartE2EDuration="48.625317207s" podCreationTimestamp="2026-03-20 11:24:29 +0000 UTC" firstStartedPulling="2026-03-20 11:24:30.967728188 +0000 UTC m=+2906.109274002" lastFinishedPulling="2026-03-20 11:25:15.981746554 +0000 UTC m=+2951.123292368" observedRunningTime="2026-03-20 11:25:17.61747911 +0000 UTC m=+2952.759024944" watchObservedRunningTime="2026-03-20 11:25:17.625317207 +0000 UTC m=+2952.766863011" Mar 20 11:25:42 crc kubenswrapper[4748]: I0320 11:25:42.928376 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:25:42 crc kubenswrapper[4748]: I0320 11:25:42.928943 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:26:00 crc kubenswrapper[4748]: I0320 11:26:00.153922 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566766-j4fsd"] Mar 20 11:26:00 crc kubenswrapper[4748]: I0320 11:26:00.157797 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-j4fsd" Mar 20 11:26:00 crc kubenswrapper[4748]: I0320 11:26:00.161967 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:26:00 crc kubenswrapper[4748]: I0320 11:26:00.164404 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:26:00 crc kubenswrapper[4748]: I0320 11:26:00.165250 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-j4fsd"] Mar 20 11:26:00 crc kubenswrapper[4748]: I0320 11:26:00.169952 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:26:00 crc kubenswrapper[4748]: I0320 11:26:00.250504 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wllt2\" (UniqueName: \"kubernetes.io/projected/016a8101-d004-4a4b-aba6-9c862a26f9f8-kube-api-access-wllt2\") pod \"auto-csr-approver-29566766-j4fsd\" (UID: 
\"016a8101-d004-4a4b-aba6-9c862a26f9f8\") " pod="openshift-infra/auto-csr-approver-29566766-j4fsd" Mar 20 11:26:00 crc kubenswrapper[4748]: I0320 11:26:00.352599 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wllt2\" (UniqueName: \"kubernetes.io/projected/016a8101-d004-4a4b-aba6-9c862a26f9f8-kube-api-access-wllt2\") pod \"auto-csr-approver-29566766-j4fsd\" (UID: \"016a8101-d004-4a4b-aba6-9c862a26f9f8\") " pod="openshift-infra/auto-csr-approver-29566766-j4fsd" Mar 20 11:26:00 crc kubenswrapper[4748]: I0320 11:26:00.377622 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wllt2\" (UniqueName: \"kubernetes.io/projected/016a8101-d004-4a4b-aba6-9c862a26f9f8-kube-api-access-wllt2\") pod \"auto-csr-approver-29566766-j4fsd\" (UID: \"016a8101-d004-4a4b-aba6-9c862a26f9f8\") " pod="openshift-infra/auto-csr-approver-29566766-j4fsd" Mar 20 11:26:00 crc kubenswrapper[4748]: I0320 11:26:00.478517 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-j4fsd" Mar 20 11:26:01 crc kubenswrapper[4748]: I0320 11:26:01.007583 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-j4fsd"] Mar 20 11:26:01 crc kubenswrapper[4748]: W0320 11:26:01.008140 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod016a8101_d004_4a4b_aba6_9c862a26f9f8.slice/crio-a72664d2d8678b08418ed57ae177ee3d4d4cd4adbd43682582f00ef6250daa1a WatchSource:0}: Error finding container a72664d2d8678b08418ed57ae177ee3d4d4cd4adbd43682582f00ef6250daa1a: Status 404 returned error can't find the container with id a72664d2d8678b08418ed57ae177ee3d4d4cd4adbd43682582f00ef6250daa1a Mar 20 11:26:02 crc kubenswrapper[4748]: I0320 11:26:02.003347 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-j4fsd" event={"ID":"016a8101-d004-4a4b-aba6-9c862a26f9f8","Type":"ContainerStarted","Data":"a72664d2d8678b08418ed57ae177ee3d4d4cd4adbd43682582f00ef6250daa1a"} Mar 20 11:26:03 crc kubenswrapper[4748]: I0320 11:26:03.014882 4748 generic.go:334] "Generic (PLEG): container finished" podID="016a8101-d004-4a4b-aba6-9c862a26f9f8" containerID="f973f04dc494ed86c1d1e3885034c229dbbb5dd3c04d2bd3d9c571bcc77ba032" exitCode=0 Mar 20 11:26:03 crc kubenswrapper[4748]: I0320 11:26:03.014941 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-j4fsd" event={"ID":"016a8101-d004-4a4b-aba6-9c862a26f9f8","Type":"ContainerDied","Data":"f973f04dc494ed86c1d1e3885034c229dbbb5dd3c04d2bd3d9c571bcc77ba032"} Mar 20 11:26:04 crc kubenswrapper[4748]: I0320 11:26:04.400531 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-j4fsd" Mar 20 11:26:04 crc kubenswrapper[4748]: I0320 11:26:04.532387 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wllt2\" (UniqueName: \"kubernetes.io/projected/016a8101-d004-4a4b-aba6-9c862a26f9f8-kube-api-access-wllt2\") pod \"016a8101-d004-4a4b-aba6-9c862a26f9f8\" (UID: \"016a8101-d004-4a4b-aba6-9c862a26f9f8\") " Mar 20 11:26:04 crc kubenswrapper[4748]: I0320 11:26:04.543110 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016a8101-d004-4a4b-aba6-9c862a26f9f8-kube-api-access-wllt2" (OuterVolumeSpecName: "kube-api-access-wllt2") pod "016a8101-d004-4a4b-aba6-9c862a26f9f8" (UID: "016a8101-d004-4a4b-aba6-9c862a26f9f8"). InnerVolumeSpecName "kube-api-access-wllt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:26:04 crc kubenswrapper[4748]: I0320 11:26:04.634901 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wllt2\" (UniqueName: \"kubernetes.io/projected/016a8101-d004-4a4b-aba6-9c862a26f9f8-kube-api-access-wllt2\") on node \"crc\" DevicePath \"\"" Mar 20 11:26:05 crc kubenswrapper[4748]: I0320 11:26:05.037160 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-j4fsd" event={"ID":"016a8101-d004-4a4b-aba6-9c862a26f9f8","Type":"ContainerDied","Data":"a72664d2d8678b08418ed57ae177ee3d4d4cd4adbd43682582f00ef6250daa1a"} Mar 20 11:26:05 crc kubenswrapper[4748]: I0320 11:26:05.037695 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a72664d2d8678b08418ed57ae177ee3d4d4cd4adbd43682582f00ef6250daa1a" Mar 20 11:26:05 crc kubenswrapper[4748]: I0320 11:26:05.037320 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-j4fsd" Mar 20 11:26:05 crc kubenswrapper[4748]: I0320 11:26:05.480653 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-pf429"] Mar 20 11:26:05 crc kubenswrapper[4748]: I0320 11:26:05.493464 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-pf429"] Mar 20 11:26:05 crc kubenswrapper[4748]: I0320 11:26:05.530710 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11faf586-5b92-4f52-87e2-cdb907eb9f9a" path="/var/lib/kubelet/pods/11faf586-5b92-4f52-87e2-cdb907eb9f9a/volumes" Mar 20 11:26:12 crc kubenswrapper[4748]: I0320 11:26:12.928820 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:26:12 crc kubenswrapper[4748]: I0320 11:26:12.929429 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:26:12 crc kubenswrapper[4748]: I0320 11:26:12.929525 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 11:26:12 crc kubenswrapper[4748]: I0320 11:26:12.930338 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a585b460ade4b7044a27fe4f08a1d29b65f1fac9c0d4f42f3e310bffe005c7ed"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:26:12 crc kubenswrapper[4748]: I0320 11:26:12.930404 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://a585b460ade4b7044a27fe4f08a1d29b65f1fac9c0d4f42f3e310bffe005c7ed" gracePeriod=600 Mar 20 11:26:13 crc kubenswrapper[4748]: I0320 11:26:13.119059 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="a585b460ade4b7044a27fe4f08a1d29b65f1fac9c0d4f42f3e310bffe005c7ed" exitCode=0 Mar 20 11:26:13 crc kubenswrapper[4748]: I0320 11:26:13.119114 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"a585b460ade4b7044a27fe4f08a1d29b65f1fac9c0d4f42f3e310bffe005c7ed"} Mar 20 11:26:13 crc kubenswrapper[4748]: I0320 11:26:13.119151 4748 scope.go:117] "RemoveContainer" containerID="9efbf264ae5e3724c39e42cf83857fbc6d1e070b3c91135d558f8fb199ffbbb5" Mar 20 11:26:14 crc kubenswrapper[4748]: I0320 11:26:14.128980 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334"} Mar 20 11:26:21 crc kubenswrapper[4748]: I0320 11:26:21.425952 4748 scope.go:117] "RemoveContainer" containerID="de2f94c17e2ff5aba9bd32186bdcf407ba76a863c40c3ddb0cdb6363dc4edd9a" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.639484 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lnbmv"] Mar 20 11:26:27 crc kubenswrapper[4748]: E0320 
11:26:27.640447 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016a8101-d004-4a4b-aba6-9c862a26f9f8" containerName="oc" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.640465 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="016a8101-d004-4a4b-aba6-9c862a26f9f8" containerName="oc" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.640690 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="016a8101-d004-4a4b-aba6-9c862a26f9f8" containerName="oc" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.642095 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.668692 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnbmv"] Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.781594 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e152aa56-2971-4596-84ee-5ba1c22ef8e3-utilities\") pod \"community-operators-lnbmv\" (UID: \"e152aa56-2971-4596-84ee-5ba1c22ef8e3\") " pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.781728 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e152aa56-2971-4596-84ee-5ba1c22ef8e3-catalog-content\") pod \"community-operators-lnbmv\" (UID: \"e152aa56-2971-4596-84ee-5ba1c22ef8e3\") " pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.781765 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p7wd\" (UniqueName: \"kubernetes.io/projected/e152aa56-2971-4596-84ee-5ba1c22ef8e3-kube-api-access-7p7wd\") pod 
\"community-operators-lnbmv\" (UID: \"e152aa56-2971-4596-84ee-5ba1c22ef8e3\") " pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.883983 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e152aa56-2971-4596-84ee-5ba1c22ef8e3-catalog-content\") pod \"community-operators-lnbmv\" (UID: \"e152aa56-2971-4596-84ee-5ba1c22ef8e3\") " pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.884055 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p7wd\" (UniqueName: \"kubernetes.io/projected/e152aa56-2971-4596-84ee-5ba1c22ef8e3-kube-api-access-7p7wd\") pod \"community-operators-lnbmv\" (UID: \"e152aa56-2971-4596-84ee-5ba1c22ef8e3\") " pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.884613 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e152aa56-2971-4596-84ee-5ba1c22ef8e3-catalog-content\") pod \"community-operators-lnbmv\" (UID: \"e152aa56-2971-4596-84ee-5ba1c22ef8e3\") " pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.884784 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e152aa56-2971-4596-84ee-5ba1c22ef8e3-utilities\") pod \"community-operators-lnbmv\" (UID: \"e152aa56-2971-4596-84ee-5ba1c22ef8e3\") " pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.885218 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e152aa56-2971-4596-84ee-5ba1c22ef8e3-utilities\") pod \"community-operators-lnbmv\" (UID: 
\"e152aa56-2971-4596-84ee-5ba1c22ef8e3\") " pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.908527 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p7wd\" (UniqueName: \"kubernetes.io/projected/e152aa56-2971-4596-84ee-5ba1c22ef8e3-kube-api-access-7p7wd\") pod \"community-operators-lnbmv\" (UID: \"e152aa56-2971-4596-84ee-5ba1c22ef8e3\") " pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:27 crc kubenswrapper[4748]: I0320 11:26:27.963084 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:28 crc kubenswrapper[4748]: I0320 11:26:28.509860 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnbmv"] Mar 20 11:26:28 crc kubenswrapper[4748]: W0320 11:26:28.515255 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode152aa56_2971_4596_84ee_5ba1c22ef8e3.slice/crio-53186ec3d44f3d2f42fd816ddadfce1425c09079030c57f4851b0ffb80d6aa08 WatchSource:0}: Error finding container 53186ec3d44f3d2f42fd816ddadfce1425c09079030c57f4851b0ffb80d6aa08: Status 404 returned error can't find the container with id 53186ec3d44f3d2f42fd816ddadfce1425c09079030c57f4851b0ffb80d6aa08 Mar 20 11:26:29 crc kubenswrapper[4748]: I0320 11:26:29.276394 4748 generic.go:334] "Generic (PLEG): container finished" podID="e152aa56-2971-4596-84ee-5ba1c22ef8e3" containerID="08a32593fcacaabbd99d262478ce81889b129ea5f13b43ba686b4c590dcda893" exitCode=0 Mar 20 11:26:29 crc kubenswrapper[4748]: I0320 11:26:29.276454 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnbmv" event={"ID":"e152aa56-2971-4596-84ee-5ba1c22ef8e3","Type":"ContainerDied","Data":"08a32593fcacaabbd99d262478ce81889b129ea5f13b43ba686b4c590dcda893"} Mar 20 11:26:29 
crc kubenswrapper[4748]: I0320 11:26:29.276724 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnbmv" event={"ID":"e152aa56-2971-4596-84ee-5ba1c22ef8e3","Type":"ContainerStarted","Data":"53186ec3d44f3d2f42fd816ddadfce1425c09079030c57f4851b0ffb80d6aa08"} Mar 20 11:26:35 crc kubenswrapper[4748]: I0320 11:26:35.357933 4748 generic.go:334] "Generic (PLEG): container finished" podID="e152aa56-2971-4596-84ee-5ba1c22ef8e3" containerID="d6c6b180eb406901d06efa284f4c93059abe58e76e7c0bc9bccd6ece3b3f37ec" exitCode=0 Mar 20 11:26:35 crc kubenswrapper[4748]: I0320 11:26:35.358024 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnbmv" event={"ID":"e152aa56-2971-4596-84ee-5ba1c22ef8e3","Type":"ContainerDied","Data":"d6c6b180eb406901d06efa284f4c93059abe58e76e7c0bc9bccd6ece3b3f37ec"} Mar 20 11:26:39 crc kubenswrapper[4748]: I0320 11:26:39.023092 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnbmv" event={"ID":"e152aa56-2971-4596-84ee-5ba1c22ef8e3","Type":"ContainerStarted","Data":"1f2482019e7190023bf4a9374b65e5b64726063336bae704f4730505b4a77bb0"} Mar 20 11:26:39 crc kubenswrapper[4748]: I0320 11:26:39.051429 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lnbmv" podStartSLOduration=3.332900798 podStartE2EDuration="12.051406919s" podCreationTimestamp="2026-03-20 11:26:27 +0000 UTC" firstStartedPulling="2026-03-20 11:26:29.278640156 +0000 UTC m=+3024.420185970" lastFinishedPulling="2026-03-20 11:26:37.997146277 +0000 UTC m=+3033.138692091" observedRunningTime="2026-03-20 11:26:39.040525046 +0000 UTC m=+3034.182070860" watchObservedRunningTime="2026-03-20 11:26:39.051406919 +0000 UTC m=+3034.192952723" Mar 20 11:26:47 crc kubenswrapper[4748]: I0320 11:26:47.963671 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:47 crc kubenswrapper[4748]: I0320 11:26:47.964224 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:48 crc kubenswrapper[4748]: I0320 11:26:48.012941 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:48 crc kubenswrapper[4748]: I0320 11:26:48.167310 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lnbmv" Mar 20 11:26:48 crc kubenswrapper[4748]: I0320 11:26:48.231275 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnbmv"] Mar 20 11:26:48 crc kubenswrapper[4748]: I0320 11:26:48.319288 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m7p7r"] Mar 20 11:26:48 crc kubenswrapper[4748]: I0320 11:26:48.319592 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m7p7r" podUID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerName="registry-server" containerID="cri-o://c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705" gracePeriod=2 Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.017924 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7p7r" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.130508 4748 generic.go:334] "Generic (PLEG): container finished" podID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerID="c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705" exitCode=0 Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.130663 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7p7r" event={"ID":"9f5e3014-2051-4bf5-a1df-b761841fcdaa","Type":"ContainerDied","Data":"c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705"} Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.131072 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7p7r" event={"ID":"9f5e3014-2051-4bf5-a1df-b761841fcdaa","Type":"ContainerDied","Data":"a768881acfa6eef2ccbd22dc4b62813f3f4ceda59c6ed5a23a4c34f18c75ebdb"} Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.131103 4748 scope.go:117] "RemoveContainer" containerID="c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.130716 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7p7r" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.173639 4748 scope.go:117] "RemoveContainer" containerID="02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.188398 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-utilities\") pod \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.188615 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rttjf\" (UniqueName: \"kubernetes.io/projected/9f5e3014-2051-4bf5-a1df-b761841fcdaa-kube-api-access-rttjf\") pod \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.188944 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-catalog-content\") pod \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\" (UID: \"9f5e3014-2051-4bf5-a1df-b761841fcdaa\") " Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.191994 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-utilities" (OuterVolumeSpecName: "utilities") pod "9f5e3014-2051-4bf5-a1df-b761841fcdaa" (UID: "9f5e3014-2051-4bf5-a1df-b761841fcdaa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.197021 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5e3014-2051-4bf5-a1df-b761841fcdaa-kube-api-access-rttjf" (OuterVolumeSpecName: "kube-api-access-rttjf") pod "9f5e3014-2051-4bf5-a1df-b761841fcdaa" (UID: "9f5e3014-2051-4bf5-a1df-b761841fcdaa"). InnerVolumeSpecName "kube-api-access-rttjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.213235 4748 scope.go:117] "RemoveContainer" containerID="f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.274455 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f5e3014-2051-4bf5-a1df-b761841fcdaa" (UID: "9f5e3014-2051-4bf5-a1df-b761841fcdaa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.291080 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rttjf\" (UniqueName: \"kubernetes.io/projected/9f5e3014-2051-4bf5-a1df-b761841fcdaa-kube-api-access-rttjf\") on node \"crc\" DevicePath \"\"" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.291112 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.291124 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f5e3014-2051-4bf5-a1df-b761841fcdaa-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.312800 4748 scope.go:117] "RemoveContainer" containerID="c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705" Mar 20 11:26:49 crc kubenswrapper[4748]: E0320 11:26:49.315348 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705\": container with ID starting with c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705 not found: ID does not exist" containerID="c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.315423 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705"} err="failed to get container status \"c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705\": rpc error: code = NotFound desc = could not find container \"c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705\": container with ID 
starting with c1ef5b64fa459611b73fc70b1351facc185315a2a5a6ebdd5c64375ae7ec4705 not found: ID does not exist" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.315465 4748 scope.go:117] "RemoveContainer" containerID="02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2" Mar 20 11:26:49 crc kubenswrapper[4748]: E0320 11:26:49.316003 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2\": container with ID starting with 02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2 not found: ID does not exist" containerID="02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.316032 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2"} err="failed to get container status \"02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2\": rpc error: code = NotFound desc = could not find container \"02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2\": container with ID starting with 02b6abc773e2198162ecd085fda3710605be41e4a5d6b20e5011e1b74a004ab2 not found: ID does not exist" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.316056 4748 scope.go:117] "RemoveContainer" containerID="f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b" Mar 20 11:26:49 crc kubenswrapper[4748]: E0320 11:26:49.316392 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b\": container with ID starting with f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b not found: ID does not exist" containerID="f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b" Mar 20 
11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.316425 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b"} err="failed to get container status \"f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b\": rpc error: code = NotFound desc = could not find container \"f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b\": container with ID starting with f3dcc17a058604c935e0fab3100d46d8231d05e491813ef46bd98f13dec34e3b not found: ID does not exist" Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.532203 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m7p7r"] Mar 20 11:26:49 crc kubenswrapper[4748]: I0320 11:26:49.534134 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m7p7r"] Mar 20 11:26:51 crc kubenswrapper[4748]: I0320 11:26:51.526764 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" path="/var/lib/kubelet/pods/9f5e3014-2051-4bf5-a1df-b761841fcdaa/volumes" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.157389 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566768-wxhzm"] Mar 20 11:28:00 crc kubenswrapper[4748]: E0320 11:28:00.158726 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerName="extract-utilities" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.158745 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerName="extract-utilities" Mar 20 11:28:00 crc kubenswrapper[4748]: E0320 11:28:00.158756 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerName="extract-content" Mar 20 11:28:00 crc kubenswrapper[4748]: 
I0320 11:28:00.158764 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerName="extract-content" Mar 20 11:28:00 crc kubenswrapper[4748]: E0320 11:28:00.158810 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerName="registry-server" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.158818 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerName="registry-server" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.159108 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5e3014-2051-4bf5-a1df-b761841fcdaa" containerName="registry-server" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.159956 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-wxhzm" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.167440 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.167549 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.167604 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.171794 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-wxhzm"] Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.295665 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl9nq\" (UniqueName: \"kubernetes.io/projected/60133479-2ce5-441a-a3d7-0e05c4f5ac67-kube-api-access-zl9nq\") pod \"auto-csr-approver-29566768-wxhzm\" (UID: 
\"60133479-2ce5-441a-a3d7-0e05c4f5ac67\") " pod="openshift-infra/auto-csr-approver-29566768-wxhzm" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.397020 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl9nq\" (UniqueName: \"kubernetes.io/projected/60133479-2ce5-441a-a3d7-0e05c4f5ac67-kube-api-access-zl9nq\") pod \"auto-csr-approver-29566768-wxhzm\" (UID: \"60133479-2ce5-441a-a3d7-0e05c4f5ac67\") " pod="openshift-infra/auto-csr-approver-29566768-wxhzm" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.426648 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl9nq\" (UniqueName: \"kubernetes.io/projected/60133479-2ce5-441a-a3d7-0e05c4f5ac67-kube-api-access-zl9nq\") pod \"auto-csr-approver-29566768-wxhzm\" (UID: \"60133479-2ce5-441a-a3d7-0e05c4f5ac67\") " pod="openshift-infra/auto-csr-approver-29566768-wxhzm" Mar 20 11:28:00 crc kubenswrapper[4748]: I0320 11:28:00.483019 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-wxhzm" Mar 20 11:28:01 crc kubenswrapper[4748]: I0320 11:28:01.013131 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:28:01 crc kubenswrapper[4748]: I0320 11:28:01.013141 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-wxhzm"] Mar 20 11:28:01 crc kubenswrapper[4748]: I0320 11:28:01.800088 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-wxhzm" event={"ID":"60133479-2ce5-441a-a3d7-0e05c4f5ac67","Type":"ContainerStarted","Data":"20e19816458ad72914d2c1c85631044dd9aa7886d279ad0bff0c11eaf553a072"} Mar 20 11:28:02 crc kubenswrapper[4748]: I0320 11:28:02.811288 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-wxhzm" event={"ID":"60133479-2ce5-441a-a3d7-0e05c4f5ac67","Type":"ContainerStarted","Data":"73434fabb3c42ae469c6a5dd03641b07e2abd41abc4c49ab70a58552dbff6c9c"} Mar 20 11:28:02 crc kubenswrapper[4748]: I0320 11:28:02.830098 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566768-wxhzm" podStartSLOduration=1.5765537809999999 podStartE2EDuration="2.830073644s" podCreationTimestamp="2026-03-20 11:28:00 +0000 UTC" firstStartedPulling="2026-03-20 11:28:01.01290918 +0000 UTC m=+3116.154454994" lastFinishedPulling="2026-03-20 11:28:02.266429033 +0000 UTC m=+3117.407974857" observedRunningTime="2026-03-20 11:28:02.825053997 +0000 UTC m=+3117.966599811" watchObservedRunningTime="2026-03-20 11:28:02.830073644 +0000 UTC m=+3117.971619458" Mar 20 11:28:03 crc kubenswrapper[4748]: I0320 11:28:03.822368 4748 generic.go:334] "Generic (PLEG): container finished" podID="60133479-2ce5-441a-a3d7-0e05c4f5ac67" containerID="73434fabb3c42ae469c6a5dd03641b07e2abd41abc4c49ab70a58552dbff6c9c" exitCode=0 Mar 20 11:28:03 crc 
kubenswrapper[4748]: I0320 11:28:03.822433 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-wxhzm" event={"ID":"60133479-2ce5-441a-a3d7-0e05c4f5ac67","Type":"ContainerDied","Data":"73434fabb3c42ae469c6a5dd03641b07e2abd41abc4c49ab70a58552dbff6c9c"} Mar 20 11:28:05 crc kubenswrapper[4748]: I0320 11:28:05.404824 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-wxhzm" Mar 20 11:28:05 crc kubenswrapper[4748]: I0320 11:28:05.502871 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl9nq\" (UniqueName: \"kubernetes.io/projected/60133479-2ce5-441a-a3d7-0e05c4f5ac67-kube-api-access-zl9nq\") pod \"60133479-2ce5-441a-a3d7-0e05c4f5ac67\" (UID: \"60133479-2ce5-441a-a3d7-0e05c4f5ac67\") " Mar 20 11:28:05 crc kubenswrapper[4748]: I0320 11:28:05.511291 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60133479-2ce5-441a-a3d7-0e05c4f5ac67-kube-api-access-zl9nq" (OuterVolumeSpecName: "kube-api-access-zl9nq") pod "60133479-2ce5-441a-a3d7-0e05c4f5ac67" (UID: "60133479-2ce5-441a-a3d7-0e05c4f5ac67"). InnerVolumeSpecName "kube-api-access-zl9nq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:28:05 crc kubenswrapper[4748]: I0320 11:28:05.605028 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl9nq\" (UniqueName: \"kubernetes.io/projected/60133479-2ce5-441a-a3d7-0e05c4f5ac67-kube-api-access-zl9nq\") on node \"crc\" DevicePath \"\"" Mar 20 11:28:05 crc kubenswrapper[4748]: I0320 11:28:05.838904 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-wxhzm" event={"ID":"60133479-2ce5-441a-a3d7-0e05c4f5ac67","Type":"ContainerDied","Data":"20e19816458ad72914d2c1c85631044dd9aa7886d279ad0bff0c11eaf553a072"} Mar 20 11:28:05 crc kubenswrapper[4748]: I0320 11:28:05.839225 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e19816458ad72914d2c1c85631044dd9aa7886d279ad0bff0c11eaf553a072" Mar 20 11:28:05 crc kubenswrapper[4748]: I0320 11:28:05.839320 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-wxhzm" Mar 20 11:28:05 crc kubenswrapper[4748]: I0320 11:28:05.907599 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-dwf2g"] Mar 20 11:28:05 crc kubenswrapper[4748]: I0320 11:28:05.916598 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-dwf2g"] Mar 20 11:28:07 crc kubenswrapper[4748]: I0320 11:28:07.529826 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8b76d0b-b803-4b0f-8f32-b3316847237d" path="/var/lib/kubelet/pods/a8b76d0b-b803-4b0f-8f32-b3316847237d/volumes" Mar 20 11:28:21 crc kubenswrapper[4748]: I0320 11:28:21.537995 4748 scope.go:117] "RemoveContainer" containerID="a05a62a5662e00d3059c7f59b6c6513779165a5e04b68bd60b4e14459c60bb5d" Mar 20 11:28:42 crc kubenswrapper[4748]: I0320 11:28:42.928616 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:28:42 crc kubenswrapper[4748]: I0320 11:28:42.929875 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.349464 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x4t49"] Mar 20 11:29:11 crc kubenswrapper[4748]: E0320 11:29:11.354939 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60133479-2ce5-441a-a3d7-0e05c4f5ac67" containerName="oc" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.354964 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="60133479-2ce5-441a-a3d7-0e05c4f5ac67" containerName="oc" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.355343 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="60133479-2ce5-441a-a3d7-0e05c4f5ac67" containerName="oc" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.357106 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.362243 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x4t49"] Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.494207 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-utilities\") pod \"certified-operators-x4t49\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.494300 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-catalog-content\") pod \"certified-operators-x4t49\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.494528 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjxnv\" (UniqueName: \"kubernetes.io/projected/01a8e444-a23d-4e97-b335-f26e8ccee9ab-kube-api-access-sjxnv\") pod \"certified-operators-x4t49\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.596423 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-utilities\") pod \"certified-operators-x4t49\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.596503 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-catalog-content\") pod \"certified-operators-x4t49\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.596733 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjxnv\" (UniqueName: \"kubernetes.io/projected/01a8e444-a23d-4e97-b335-f26e8ccee9ab-kube-api-access-sjxnv\") pod \"certified-operators-x4t49\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.597279 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-utilities\") pod \"certified-operators-x4t49\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.597361 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-catalog-content\") pod \"certified-operators-x4t49\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.622527 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjxnv\" (UniqueName: \"kubernetes.io/projected/01a8e444-a23d-4e97-b335-f26e8ccee9ab-kube-api-access-sjxnv\") pod \"certified-operators-x4t49\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:11 crc kubenswrapper[4748]: I0320 11:29:11.691222 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:12 crc kubenswrapper[4748]: I0320 11:29:12.246104 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x4t49"] Mar 20 11:29:12 crc kubenswrapper[4748]: I0320 11:29:12.436922 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4t49" event={"ID":"01a8e444-a23d-4e97-b335-f26e8ccee9ab","Type":"ContainerStarted","Data":"8f7cbb099b70fec2e78b6a2c955c905715e4967d9ca6de7cfb520ff2232bdd97"} Mar 20 11:29:12 crc kubenswrapper[4748]: I0320 11:29:12.928220 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:29:12 crc kubenswrapper[4748]: I0320 11:29:12.928618 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:13 crc kubenswrapper[4748]: I0320 11:29:13.446220 4748 generic.go:334] "Generic (PLEG): container finished" podID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerID="7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469" exitCode=0 Mar 20 11:29:13 crc kubenswrapper[4748]: I0320 11:29:13.446288 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4t49" event={"ID":"01a8e444-a23d-4e97-b335-f26e8ccee9ab","Type":"ContainerDied","Data":"7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469"} Mar 20 11:29:14 crc kubenswrapper[4748]: I0320 11:29:14.459165 4748 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4t49" event={"ID":"01a8e444-a23d-4e97-b335-f26e8ccee9ab","Type":"ContainerStarted","Data":"73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1"} Mar 20 11:29:15 crc kubenswrapper[4748]: I0320 11:29:15.470117 4748 generic.go:334] "Generic (PLEG): container finished" podID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerID="73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1" exitCode=0 Mar 20 11:29:15 crc kubenswrapper[4748]: I0320 11:29:15.470168 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4t49" event={"ID":"01a8e444-a23d-4e97-b335-f26e8ccee9ab","Type":"ContainerDied","Data":"73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1"} Mar 20 11:29:17 crc kubenswrapper[4748]: I0320 11:29:17.488634 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4t49" event={"ID":"01a8e444-a23d-4e97-b335-f26e8ccee9ab","Type":"ContainerStarted","Data":"7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4"} Mar 20 11:29:17 crc kubenswrapper[4748]: I0320 11:29:17.511723 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x4t49" podStartSLOduration=2.942592057 podStartE2EDuration="6.511701083s" podCreationTimestamp="2026-03-20 11:29:11 +0000 UTC" firstStartedPulling="2026-03-20 11:29:13.448982813 +0000 UTC m=+3188.590528627" lastFinishedPulling="2026-03-20 11:29:17.018091839 +0000 UTC m=+3192.159637653" observedRunningTime="2026-03-20 11:29:17.504185563 +0000 UTC m=+3192.645731377" watchObservedRunningTime="2026-03-20 11:29:17.511701083 +0000 UTC m=+3192.653246897" Mar 20 11:29:21 crc kubenswrapper[4748]: I0320 11:29:21.691648 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:21 crc kubenswrapper[4748]: 
I0320 11:29:21.693168 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:21 crc kubenswrapper[4748]: I0320 11:29:21.748337 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:22 crc kubenswrapper[4748]: I0320 11:29:22.587478 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:22 crc kubenswrapper[4748]: I0320 11:29:22.646229 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x4t49"] Mar 20 11:29:24 crc kubenswrapper[4748]: I0320 11:29:24.553216 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x4t49" podUID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerName="registry-server" containerID="cri-o://7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4" gracePeriod=2 Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.257075 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.354619 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjxnv\" (UniqueName: \"kubernetes.io/projected/01a8e444-a23d-4e97-b335-f26e8ccee9ab-kube-api-access-sjxnv\") pod \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.354857 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-utilities\") pod \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.354926 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-catalog-content\") pod \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\" (UID: \"01a8e444-a23d-4e97-b335-f26e8ccee9ab\") " Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.355619 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-utilities" (OuterVolumeSpecName: "utilities") pod "01a8e444-a23d-4e97-b335-f26e8ccee9ab" (UID: "01a8e444-a23d-4e97-b335-f26e8ccee9ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.366585 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a8e444-a23d-4e97-b335-f26e8ccee9ab-kube-api-access-sjxnv" (OuterVolumeSpecName: "kube-api-access-sjxnv") pod "01a8e444-a23d-4e97-b335-f26e8ccee9ab" (UID: "01a8e444-a23d-4e97-b335-f26e8ccee9ab"). InnerVolumeSpecName "kube-api-access-sjxnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.436165 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01a8e444-a23d-4e97-b335-f26e8ccee9ab" (UID: "01a8e444-a23d-4e97-b335-f26e8ccee9ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.457091 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.457124 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a8e444-a23d-4e97-b335-f26e8ccee9ab-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.457139 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjxnv\" (UniqueName: \"kubernetes.io/projected/01a8e444-a23d-4e97-b335-f26e8ccee9ab-kube-api-access-sjxnv\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.565430 4748 generic.go:334] "Generic (PLEG): container finished" podID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerID="7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4" exitCode=0 Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.565472 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4t49" event={"ID":"01a8e444-a23d-4e97-b335-f26e8ccee9ab","Type":"ContainerDied","Data":"7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4"} Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.565497 4748 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-x4t49" event={"ID":"01a8e444-a23d-4e97-b335-f26e8ccee9ab","Type":"ContainerDied","Data":"8f7cbb099b70fec2e78b6a2c955c905715e4967d9ca6de7cfb520ff2232bdd97"} Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.565512 4748 scope.go:117] "RemoveContainer" containerID="7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.565632 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4t49" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.601992 4748 scope.go:117] "RemoveContainer" containerID="73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.605777 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x4t49"] Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.616296 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x4t49"] Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.626772 4748 scope.go:117] "RemoveContainer" containerID="7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.674229 4748 scope.go:117] "RemoveContainer" containerID="7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4" Mar 20 11:29:25 crc kubenswrapper[4748]: E0320 11:29:25.675330 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4\": container with ID starting with 7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4 not found: ID does not exist" containerID="7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 
11:29:25.675463 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4"} err="failed to get container status \"7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4\": rpc error: code = NotFound desc = could not find container \"7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4\": container with ID starting with 7c7c77d12d8f179b30a7745717afb00c798f80b54accf3f95008a5d1bfd540e4 not found: ID does not exist" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.675581 4748 scope.go:117] "RemoveContainer" containerID="73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1" Mar 20 11:29:25 crc kubenswrapper[4748]: E0320 11:29:25.676124 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1\": container with ID starting with 73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1 not found: ID does not exist" containerID="73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.676158 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1"} err="failed to get container status \"73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1\": rpc error: code = NotFound desc = could not find container \"73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1\": container with ID starting with 73a9a5e78f3f2030181f5930d927b305aee499c119b4f2dc523eead55b324be1 not found: ID does not exist" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.676182 4748 scope.go:117] "RemoveContainer" containerID="7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469" Mar 20 11:29:25 crc 
kubenswrapper[4748]: E0320 11:29:25.676490 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469\": container with ID starting with 7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469 not found: ID does not exist" containerID="7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469" Mar 20 11:29:25 crc kubenswrapper[4748]: I0320 11:29:25.676584 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469"} err="failed to get container status \"7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469\": rpc error: code = NotFound desc = could not find container \"7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469\": container with ID starting with 7068eecf75ea273c666c7f9eeb610b2a152f54f43fb6fe9af334d56cfdfbe469 not found: ID does not exist" Mar 20 11:29:27 crc kubenswrapper[4748]: I0320 11:29:27.533959 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" path="/var/lib/kubelet/pods/01a8e444-a23d-4e97-b335-f26e8ccee9ab/volumes" Mar 20 11:29:42 crc kubenswrapper[4748]: I0320 11:29:42.928210 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:29:42 crc kubenswrapper[4748]: I0320 11:29:42.928800 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 20 11:29:42 crc kubenswrapper[4748]: I0320 11:29:42.928866 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 11:29:42 crc kubenswrapper[4748]: I0320 11:29:42.929597 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:29:42 crc kubenswrapper[4748]: I0320 11:29:42.929648 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" gracePeriod=600 Mar 20 11:29:43 crc kubenswrapper[4748]: E0320 11:29:43.046592 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:29:43 crc kubenswrapper[4748]: I0320 11:29:43.711341 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" exitCode=0 Mar 20 11:29:43 crc kubenswrapper[4748]: I0320 11:29:43.711396 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334"} Mar 20 11:29:43 crc kubenswrapper[4748]: I0320 11:29:43.711437 4748 scope.go:117] "RemoveContainer" containerID="a585b460ade4b7044a27fe4f08a1d29b65f1fac9c0d4f42f3e310bffe005c7ed" Mar 20 11:29:43 crc kubenswrapper[4748]: I0320 11:29:43.712280 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:29:43 crc kubenswrapper[4748]: E0320 11:29:43.712593 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:29:55 crc kubenswrapper[4748]: I0320 11:29:55.521544 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:29:55 crc kubenswrapper[4748]: E0320 11:29:55.522326 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.186248 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566770-gvk8w"] Mar 20 11:30:00 crc kubenswrapper[4748]: E0320 11:30:00.187352 4748 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerName="registry-server" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.187371 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerName="registry-server" Mar 20 11:30:00 crc kubenswrapper[4748]: E0320 11:30:00.187399 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerName="extract-utilities" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.187409 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerName="extract-utilities" Mar 20 11:30:00 crc kubenswrapper[4748]: E0320 11:30:00.187426 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerName="extract-content" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.187433 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerName="extract-content" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.187657 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a8e444-a23d-4e97-b335-f26e8ccee9ab" containerName="registry-server" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.188364 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-gvk8w" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.190915 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.190977 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.191415 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.198624 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r"] Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.200259 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.202018 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n4fd\" (UniqueName: \"kubernetes.io/projected/8d96e85b-256f-4c95-bcae-d38395ee0820-kube-api-access-8n4fd\") pod \"auto-csr-approver-29566770-gvk8w\" (UID: \"8d96e85b-256f-4c95-bcae-d38395ee0820\") " pod="openshift-infra/auto-csr-approver-29566770-gvk8w" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.208323 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.208534 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.226009 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29566770-gvk8w"] Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.239718 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r"] Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.303396 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01896c60-49a4-481c-83af-53724b9c6edc-config-volume\") pod \"collect-profiles-29566770-xr87r\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.303518 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01896c60-49a4-481c-83af-53724b9c6edc-secret-volume\") pod \"collect-profiles-29566770-xr87r\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.303627 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n4fd\" (UniqueName: \"kubernetes.io/projected/8d96e85b-256f-4c95-bcae-d38395ee0820-kube-api-access-8n4fd\") pod \"auto-csr-approver-29566770-gvk8w\" (UID: \"8d96e85b-256f-4c95-bcae-d38395ee0820\") " pod="openshift-infra/auto-csr-approver-29566770-gvk8w" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.303938 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kkc9\" (UniqueName: \"kubernetes.io/projected/01896c60-49a4-481c-83af-53724b9c6edc-kube-api-access-2kkc9\") pod \"collect-profiles-29566770-xr87r\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" 
Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.322912 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n4fd\" (UniqueName: \"kubernetes.io/projected/8d96e85b-256f-4c95-bcae-d38395ee0820-kube-api-access-8n4fd\") pod \"auto-csr-approver-29566770-gvk8w\" (UID: \"8d96e85b-256f-4c95-bcae-d38395ee0820\") " pod="openshift-infra/auto-csr-approver-29566770-gvk8w" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.405952 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kkc9\" (UniqueName: \"kubernetes.io/projected/01896c60-49a4-481c-83af-53724b9c6edc-kube-api-access-2kkc9\") pod \"collect-profiles-29566770-xr87r\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.406382 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01896c60-49a4-481c-83af-53724b9c6edc-config-volume\") pod \"collect-profiles-29566770-xr87r\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.406414 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01896c60-49a4-481c-83af-53724b9c6edc-secret-volume\") pod \"collect-profiles-29566770-xr87r\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.407277 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01896c60-49a4-481c-83af-53724b9c6edc-config-volume\") pod \"collect-profiles-29566770-xr87r\" (UID: 
\"01896c60-49a4-481c-83af-53724b9c6edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.410046 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01896c60-49a4-481c-83af-53724b9c6edc-secret-volume\") pod \"collect-profiles-29566770-xr87r\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.425763 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kkc9\" (UniqueName: \"kubernetes.io/projected/01896c60-49a4-481c-83af-53724b9c6edc-kube-api-access-2kkc9\") pod \"collect-profiles-29566770-xr87r\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.521491 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-gvk8w" Mar 20 11:30:00 crc kubenswrapper[4748]: I0320 11:30:00.538485 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:01 crc kubenswrapper[4748]: I0320 11:30:01.010859 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-gvk8w"] Mar 20 11:30:01 crc kubenswrapper[4748]: I0320 11:30:01.161361 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r"] Mar 20 11:30:02 crc kubenswrapper[4748]: I0320 11:30:02.015731 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-gvk8w" event={"ID":"8d96e85b-256f-4c95-bcae-d38395ee0820","Type":"ContainerStarted","Data":"1aa43da711a200bc67f6f6e18e1214df87877d80a8a74ed5afcc17d6e1e9dbb0"} Mar 20 11:30:02 crc kubenswrapper[4748]: I0320 11:30:02.017530 4748 generic.go:334] "Generic (PLEG): container finished" podID="01896c60-49a4-481c-83af-53724b9c6edc" containerID="3db768adadb0a0961feb393214cf338038d779c077577fbedfa1cfbfe828c74b" exitCode=0 Mar 20 11:30:02 crc kubenswrapper[4748]: I0320 11:30:02.017573 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" event={"ID":"01896c60-49a4-481c-83af-53724b9c6edc","Type":"ContainerDied","Data":"3db768adadb0a0961feb393214cf338038d779c077577fbedfa1cfbfe828c74b"} Mar 20 11:30:02 crc kubenswrapper[4748]: I0320 11:30:02.017597 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" event={"ID":"01896c60-49a4-481c-83af-53724b9c6edc","Type":"ContainerStarted","Data":"e46aaa11b27812ccd09445cdaa508973d6db478e787da6425bf1621d45c631c7"} Mar 20 11:30:03 crc kubenswrapper[4748]: I0320 11:30:03.619234 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:03 crc kubenswrapper[4748]: I0320 11:30:03.769392 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kkc9\" (UniqueName: \"kubernetes.io/projected/01896c60-49a4-481c-83af-53724b9c6edc-kube-api-access-2kkc9\") pod \"01896c60-49a4-481c-83af-53724b9c6edc\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " Mar 20 11:30:03 crc kubenswrapper[4748]: I0320 11:30:03.769951 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01896c60-49a4-481c-83af-53724b9c6edc-config-volume\") pod \"01896c60-49a4-481c-83af-53724b9c6edc\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " Mar 20 11:30:03 crc kubenswrapper[4748]: I0320 11:30:03.769990 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01896c60-49a4-481c-83af-53724b9c6edc-secret-volume\") pod \"01896c60-49a4-481c-83af-53724b9c6edc\" (UID: \"01896c60-49a4-481c-83af-53724b9c6edc\") " Mar 20 11:30:03 crc kubenswrapper[4748]: I0320 11:30:03.770573 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01896c60-49a4-481c-83af-53724b9c6edc-config-volume" (OuterVolumeSpecName: "config-volume") pod "01896c60-49a4-481c-83af-53724b9c6edc" (UID: "01896c60-49a4-481c-83af-53724b9c6edc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4748]: I0320 11:30:03.776709 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01896c60-49a4-481c-83af-53724b9c6edc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "01896c60-49a4-481c-83af-53724b9c6edc" (UID: "01896c60-49a4-481c-83af-53724b9c6edc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4748]: I0320 11:30:03.782726 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01896c60-49a4-481c-83af-53724b9c6edc-kube-api-access-2kkc9" (OuterVolumeSpecName: "kube-api-access-2kkc9") pod "01896c60-49a4-481c-83af-53724b9c6edc" (UID: "01896c60-49a4-481c-83af-53724b9c6edc"). InnerVolumeSpecName "kube-api-access-2kkc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4748]: I0320 11:30:03.872461 4748 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01896c60-49a4-481c-83af-53724b9c6edc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:03 crc kubenswrapper[4748]: I0320 11:30:03.872512 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kkc9\" (UniqueName: \"kubernetes.io/projected/01896c60-49a4-481c-83af-53724b9c6edc-kube-api-access-2kkc9\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:03 crc kubenswrapper[4748]: I0320 11:30:03.872526 4748 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01896c60-49a4-481c-83af-53724b9c6edc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:04 crc kubenswrapper[4748]: I0320 11:30:04.040957 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" event={"ID":"01896c60-49a4-481c-83af-53724b9c6edc","Type":"ContainerDied","Data":"e46aaa11b27812ccd09445cdaa508973d6db478e787da6425bf1621d45c631c7"} Mar 20 11:30:04 crc kubenswrapper[4748]: I0320 11:30:04.041011 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e46aaa11b27812ccd09445cdaa508973d6db478e787da6425bf1621d45c631c7" Mar 20 11:30:04 crc kubenswrapper[4748]: I0320 11:30:04.041076 4748 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-xr87r" Mar 20 11:30:04 crc kubenswrapper[4748]: I0320 11:30:04.053724 4748 generic.go:334] "Generic (PLEG): container finished" podID="8d96e85b-256f-4c95-bcae-d38395ee0820" containerID="a09888fd6fab3edbd3166869eecff9abba2fc54929fbf2edc5615ba03cf6ad3a" exitCode=0 Mar 20 11:30:04 crc kubenswrapper[4748]: I0320 11:30:04.053776 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-gvk8w" event={"ID":"8d96e85b-256f-4c95-bcae-d38395ee0820","Type":"ContainerDied","Data":"a09888fd6fab3edbd3166869eecff9abba2fc54929fbf2edc5615ba03cf6ad3a"} Mar 20 11:30:04 crc kubenswrapper[4748]: I0320 11:30:04.701796 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7"] Mar 20 11:30:04 crc kubenswrapper[4748]: I0320 11:30:04.715086 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-gpxq7"] Mar 20 11:30:05 crc kubenswrapper[4748]: I0320 11:30:05.491866 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-gvk8w" Mar 20 11:30:05 crc kubenswrapper[4748]: I0320 11:30:05.528867 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ead5bfd4-b9a5-4edf-b020-e088057446c8" path="/var/lib/kubelet/pods/ead5bfd4-b9a5-4edf-b020-e088057446c8/volumes" Mar 20 11:30:05 crc kubenswrapper[4748]: I0320 11:30:05.606145 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n4fd\" (UniqueName: \"kubernetes.io/projected/8d96e85b-256f-4c95-bcae-d38395ee0820-kube-api-access-8n4fd\") pod \"8d96e85b-256f-4c95-bcae-d38395ee0820\" (UID: \"8d96e85b-256f-4c95-bcae-d38395ee0820\") " Mar 20 11:30:05 crc kubenswrapper[4748]: I0320 11:30:05.612670 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d96e85b-256f-4c95-bcae-d38395ee0820-kube-api-access-8n4fd" (OuterVolumeSpecName: "kube-api-access-8n4fd") pod "8d96e85b-256f-4c95-bcae-d38395ee0820" (UID: "8d96e85b-256f-4c95-bcae-d38395ee0820"). InnerVolumeSpecName "kube-api-access-8n4fd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:30:05 crc kubenswrapper[4748]: I0320 11:30:05.708678 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n4fd\" (UniqueName: \"kubernetes.io/projected/8d96e85b-256f-4c95-bcae-d38395ee0820-kube-api-access-8n4fd\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:06 crc kubenswrapper[4748]: I0320 11:30:06.071346 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-gvk8w" event={"ID":"8d96e85b-256f-4c95-bcae-d38395ee0820","Type":"ContainerDied","Data":"1aa43da711a200bc67f6f6e18e1214df87877d80a8a74ed5afcc17d6e1e9dbb0"} Mar 20 11:30:06 crc kubenswrapper[4748]: I0320 11:30:06.071410 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aa43da711a200bc67f6f6e18e1214df87877d80a8a74ed5afcc17d6e1e9dbb0" Mar 20 11:30:06 crc kubenswrapper[4748]: I0320 11:30:06.071486 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-gvk8w" Mar 20 11:30:06 crc kubenswrapper[4748]: I0320 11:30:06.555462 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-wfwcl"] Mar 20 11:30:06 crc kubenswrapper[4748]: I0320 11:30:06.564433 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-wfwcl"] Mar 20 11:30:07 crc kubenswrapper[4748]: I0320 11:30:07.528133 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd94020-5267-4d83-94e4-4cd358e1f134" path="/var/lib/kubelet/pods/8fd94020-5267-4d83-94e4-4cd358e1f134/volumes" Mar 20 11:30:09 crc kubenswrapper[4748]: I0320 11:30:09.515879 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:30:09 crc kubenswrapper[4748]: E0320 11:30:09.516471 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:30:21 crc kubenswrapper[4748]: I0320 11:30:21.624859 4748 scope.go:117] "RemoveContainer" containerID="39e42753a07aa0fb0b7dec08330ef381824d586618f5f35d52c00091f0aff57d" Mar 20 11:30:21 crc kubenswrapper[4748]: I0320 11:30:21.669659 4748 scope.go:117] "RemoveContainer" containerID="5bd32b9d23d9890b6a0cb2ba9e1c35f3220e35a9f55eb4065b0e177debfa0db2" Mar 20 11:30:24 crc kubenswrapper[4748]: I0320 11:30:24.515331 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:30:24 crc kubenswrapper[4748]: E0320 11:30:24.515919 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:30:39 crc kubenswrapper[4748]: I0320 11:30:39.516202 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:30:39 crc kubenswrapper[4748]: E0320 11:30:39.516939 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:30:51 crc kubenswrapper[4748]: I0320 11:30:51.516355 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:30:51 crc kubenswrapper[4748]: E0320 11:30:51.517050 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:31:06 crc kubenswrapper[4748]: I0320 11:31:06.514825 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:31:06 crc kubenswrapper[4748]: E0320 11:31:06.516752 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:31:21 crc kubenswrapper[4748]: I0320 11:31:21.515595 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:31:21 crc kubenswrapper[4748]: E0320 11:31:21.516415 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:31:35 crc kubenswrapper[4748]: I0320 11:31:35.522438 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:31:35 crc kubenswrapper[4748]: E0320 11:31:35.523308 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:31:47 crc kubenswrapper[4748]: I0320 11:31:47.516175 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:31:47 crc kubenswrapper[4748]: E0320 11:31:47.516956 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:31:58 crc kubenswrapper[4748]: I0320 11:31:58.516561 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:31:58 crc kubenswrapper[4748]: E0320 11:31:58.517379 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.148723 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566772-j4rsj"] Mar 20 11:32:00 crc kubenswrapper[4748]: E0320 11:32:00.150211 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d96e85b-256f-4c95-bcae-d38395ee0820" containerName="oc" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.150312 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d96e85b-256f-4c95-bcae-d38395ee0820" containerName="oc" Mar 20 11:32:00 crc kubenswrapper[4748]: E0320 11:32:00.150404 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01896c60-49a4-481c-83af-53724b9c6edc" containerName="collect-profiles" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.150483 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="01896c60-49a4-481c-83af-53724b9c6edc" containerName="collect-profiles" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.150804 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="01896c60-49a4-481c-83af-53724b9c6edc" containerName="collect-profiles" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.150979 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d96e85b-256f-4c95-bcae-d38395ee0820" containerName="oc" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.151858 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-j4rsj" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.154270 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.154639 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.154978 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.160084 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-j4rsj"] Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.276854 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68kn9\" (UniqueName: \"kubernetes.io/projected/258d3ead-d0c6-4a5a-8687-69b4385f5eb4-kube-api-access-68kn9\") pod \"auto-csr-approver-29566772-j4rsj\" (UID: \"258d3ead-d0c6-4a5a-8687-69b4385f5eb4\") " pod="openshift-infra/auto-csr-approver-29566772-j4rsj" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.379170 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68kn9\" (UniqueName: \"kubernetes.io/projected/258d3ead-d0c6-4a5a-8687-69b4385f5eb4-kube-api-access-68kn9\") pod \"auto-csr-approver-29566772-j4rsj\" (UID: \"258d3ead-d0c6-4a5a-8687-69b4385f5eb4\") " pod="openshift-infra/auto-csr-approver-29566772-j4rsj" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.399523 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68kn9\" (UniqueName: \"kubernetes.io/projected/258d3ead-d0c6-4a5a-8687-69b4385f5eb4-kube-api-access-68kn9\") pod \"auto-csr-approver-29566772-j4rsj\" (UID: \"258d3ead-d0c6-4a5a-8687-69b4385f5eb4\") " 
pod="openshift-infra/auto-csr-approver-29566772-j4rsj" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.477015 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-j4rsj" Mar 20 11:32:00 crc kubenswrapper[4748]: I0320 11:32:00.908314 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-j4rsj"] Mar 20 11:32:01 crc kubenswrapper[4748]: I0320 11:32:01.021994 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-j4rsj" event={"ID":"258d3ead-d0c6-4a5a-8687-69b4385f5eb4","Type":"ContainerStarted","Data":"d86714449604abc9bf8c6e8f64342da7d3eb8e5ef5c2cf3f3f1b4977cd88bcde"} Mar 20 11:32:03 crc kubenswrapper[4748]: I0320 11:32:03.047601 4748 generic.go:334] "Generic (PLEG): container finished" podID="258d3ead-d0c6-4a5a-8687-69b4385f5eb4" containerID="81f36b4a6b7e34a56376654022dfc979842f5e530672e4498fdf8e821f03d05c" exitCode=0 Mar 20 11:32:03 crc kubenswrapper[4748]: I0320 11:32:03.048162 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-j4rsj" event={"ID":"258d3ead-d0c6-4a5a-8687-69b4385f5eb4","Type":"ContainerDied","Data":"81f36b4a6b7e34a56376654022dfc979842f5e530672e4498fdf8e821f03d05c"} Mar 20 11:32:04 crc kubenswrapper[4748]: I0320 11:32:04.490055 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-j4rsj" Mar 20 11:32:04 crc kubenswrapper[4748]: I0320 11:32:04.569078 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68kn9\" (UniqueName: \"kubernetes.io/projected/258d3ead-d0c6-4a5a-8687-69b4385f5eb4-kube-api-access-68kn9\") pod \"258d3ead-d0c6-4a5a-8687-69b4385f5eb4\" (UID: \"258d3ead-d0c6-4a5a-8687-69b4385f5eb4\") " Mar 20 11:32:04 crc kubenswrapper[4748]: I0320 11:32:04.579281 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258d3ead-d0c6-4a5a-8687-69b4385f5eb4-kube-api-access-68kn9" (OuterVolumeSpecName: "kube-api-access-68kn9") pod "258d3ead-d0c6-4a5a-8687-69b4385f5eb4" (UID: "258d3ead-d0c6-4a5a-8687-69b4385f5eb4"). InnerVolumeSpecName "kube-api-access-68kn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:32:04 crc kubenswrapper[4748]: I0320 11:32:04.670466 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68kn9\" (UniqueName: \"kubernetes.io/projected/258d3ead-d0c6-4a5a-8687-69b4385f5eb4-kube-api-access-68kn9\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:05 crc kubenswrapper[4748]: I0320 11:32:05.067101 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-j4rsj" event={"ID":"258d3ead-d0c6-4a5a-8687-69b4385f5eb4","Type":"ContainerDied","Data":"d86714449604abc9bf8c6e8f64342da7d3eb8e5ef5c2cf3f3f1b4977cd88bcde"} Mar 20 11:32:05 crc kubenswrapper[4748]: I0320 11:32:05.067152 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d86714449604abc9bf8c6e8f64342da7d3eb8e5ef5c2cf3f3f1b4977cd88bcde" Mar 20 11:32:05 crc kubenswrapper[4748]: I0320 11:32:05.067182 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-j4rsj" Mar 20 11:32:05 crc kubenswrapper[4748]: I0320 11:32:05.573982 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-j4fsd"] Mar 20 11:32:05 crc kubenswrapper[4748]: I0320 11:32:05.585353 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-j4fsd"] Mar 20 11:32:07 crc kubenswrapper[4748]: I0320 11:32:07.529491 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="016a8101-d004-4a4b-aba6-9c862a26f9f8" path="/var/lib/kubelet/pods/016a8101-d004-4a4b-aba6-9c862a26f9f8/volumes" Mar 20 11:32:11 crc kubenswrapper[4748]: I0320 11:32:11.515396 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:32:11 crc kubenswrapper[4748]: E0320 11:32:11.516289 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:32:21 crc kubenswrapper[4748]: I0320 11:32:21.808422 4748 scope.go:117] "RemoveContainer" containerID="f973f04dc494ed86c1d1e3885034c229dbbb5dd3c04d2bd3d9c571bcc77ba032" Mar 20 11:32:24 crc kubenswrapper[4748]: I0320 11:32:24.516086 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:32:24 crc kubenswrapper[4748]: E0320 11:32:24.516648 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.325251 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-szqxd"] Mar 20 11:32:25 crc kubenswrapper[4748]: E0320 11:32:25.325689 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258d3ead-d0c6-4a5a-8687-69b4385f5eb4" containerName="oc" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.325704 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="258d3ead-d0c6-4a5a-8687-69b4385f5eb4" containerName="oc" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.325902 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="258d3ead-d0c6-4a5a-8687-69b4385f5eb4" containerName="oc" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.327796 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.338882 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-szqxd"] Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.455578 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-catalog-content\") pod \"redhat-operators-szqxd\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.455894 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k744w\" (UniqueName: \"kubernetes.io/projected/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-kube-api-access-k744w\") pod \"redhat-operators-szqxd\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.456125 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-utilities\") pod \"redhat-operators-szqxd\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.558197 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-utilities\") pod \"redhat-operators-szqxd\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.558918 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-utilities\") pod \"redhat-operators-szqxd\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.559045 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-catalog-content\") pod \"redhat-operators-szqxd\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.559193 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k744w\" (UniqueName: \"kubernetes.io/projected/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-kube-api-access-k744w\") pod \"redhat-operators-szqxd\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.559387 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-catalog-content\") pod \"redhat-operators-szqxd\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.582721 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k744w\" (UniqueName: \"kubernetes.io/projected/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-kube-api-access-k744w\") pod \"redhat-operators-szqxd\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:25 crc kubenswrapper[4748]: I0320 11:32:25.646345 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:26 crc kubenswrapper[4748]: I0320 11:32:26.104534 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-szqxd"] Mar 20 11:32:26 crc kubenswrapper[4748]: I0320 11:32:26.267387 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-szqxd" event={"ID":"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e","Type":"ContainerStarted","Data":"d3cde145420ad0c099330f6d510b6223b67c52df6701132ebed05f4fbac9d648"} Mar 20 11:32:27 crc kubenswrapper[4748]: I0320 11:32:27.278999 4748 generic.go:334] "Generic (PLEG): container finished" podID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerID="f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5" exitCode=0 Mar 20 11:32:27 crc kubenswrapper[4748]: I0320 11:32:27.279050 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-szqxd" event={"ID":"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e","Type":"ContainerDied","Data":"f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5"} Mar 20 11:32:30 crc kubenswrapper[4748]: I0320 11:32:30.308281 4748 generic.go:334] "Generic (PLEG): container finished" podID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerID="730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6" exitCode=0 Mar 20 11:32:30 crc kubenswrapper[4748]: I0320 11:32:30.308504 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-szqxd" event={"ID":"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e","Type":"ContainerDied","Data":"730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6"} Mar 20 11:32:31 crc kubenswrapper[4748]: I0320 11:32:31.323852 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-szqxd" 
event={"ID":"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e","Type":"ContainerStarted","Data":"8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d"} Mar 20 11:32:31 crc kubenswrapper[4748]: I0320 11:32:31.346279 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-szqxd" podStartSLOduration=2.913992221 podStartE2EDuration="6.346263358s" podCreationTimestamp="2026-03-20 11:32:25 +0000 UTC" firstStartedPulling="2026-03-20 11:32:27.281522406 +0000 UTC m=+3382.423068220" lastFinishedPulling="2026-03-20 11:32:30.713793543 +0000 UTC m=+3385.855339357" observedRunningTime="2026-03-20 11:32:31.338428539 +0000 UTC m=+3386.479974353" watchObservedRunningTime="2026-03-20 11:32:31.346263358 +0000 UTC m=+3386.487809172" Mar 20 11:32:35 crc kubenswrapper[4748]: I0320 11:32:35.647302 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:35 crc kubenswrapper[4748]: I0320 11:32:35.647643 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:36 crc kubenswrapper[4748]: I0320 11:32:36.693340 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-szqxd" podUID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerName="registry-server" probeResult="failure" output=< Mar 20 11:32:36 crc kubenswrapper[4748]: timeout: failed to connect service ":50051" within 1s Mar 20 11:32:36 crc kubenswrapper[4748]: > Mar 20 11:32:39 crc kubenswrapper[4748]: I0320 11:32:39.515734 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:32:39 crc kubenswrapper[4748]: E0320 11:32:39.516545 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:32:45 crc kubenswrapper[4748]: I0320 11:32:45.695020 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:45 crc kubenswrapper[4748]: I0320 11:32:45.748258 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:45 crc kubenswrapper[4748]: I0320 11:32:45.929271 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-szqxd"] Mar 20 11:32:47 crc kubenswrapper[4748]: I0320 11:32:47.485925 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-szqxd" podUID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerName="registry-server" containerID="cri-o://8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d" gracePeriod=2 Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.203270 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.309786 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-catalog-content\") pod \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.309932 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k744w\" (UniqueName: \"kubernetes.io/projected/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-kube-api-access-k744w\") pod \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.310180 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-utilities\") pod \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\" (UID: \"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e\") " Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.311521 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-utilities" (OuterVolumeSpecName: "utilities") pod "1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" (UID: "1eb5c60a-5677-4dc2-81fe-9fb6dc15947e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.315717 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-kube-api-access-k744w" (OuterVolumeSpecName: "kube-api-access-k744w") pod "1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" (UID: "1eb5c60a-5677-4dc2-81fe-9fb6dc15947e"). InnerVolumeSpecName "kube-api-access-k744w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.412185 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.412238 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k744w\" (UniqueName: \"kubernetes.io/projected/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-kube-api-access-k744w\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.443802 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" (UID: "1eb5c60a-5677-4dc2-81fe-9fb6dc15947e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.514670 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.521674 4748 generic.go:334] "Generic (PLEG): container finished" podID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerID="8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d" exitCode=0 Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.522058 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-szqxd" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.522096 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-szqxd" event={"ID":"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e","Type":"ContainerDied","Data":"8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d"} Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.524023 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-szqxd" event={"ID":"1eb5c60a-5677-4dc2-81fe-9fb6dc15947e","Type":"ContainerDied","Data":"d3cde145420ad0c099330f6d510b6223b67c52df6701132ebed05f4fbac9d648"} Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.524045 4748 scope.go:117] "RemoveContainer" containerID="8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.578089 4748 scope.go:117] "RemoveContainer" containerID="730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.608888 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-szqxd"] Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.615936 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-szqxd"] Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.629062 4748 scope.go:117] "RemoveContainer" containerID="f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.667936 4748 scope.go:117] "RemoveContainer" containerID="8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d" Mar 20 11:32:48 crc kubenswrapper[4748]: E0320 11:32:48.668844 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d\": container with ID starting with 8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d not found: ID does not exist" containerID="8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.668927 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d"} err="failed to get container status \"8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d\": rpc error: code = NotFound desc = could not find container \"8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d\": container with ID starting with 8d6cee471afa999d2f5d9370f140f6ba823415c5f9ba75c9a79e2f4356e4b69d not found: ID does not exist" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.668985 4748 scope.go:117] "RemoveContainer" containerID="730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6" Mar 20 11:32:48 crc kubenswrapper[4748]: E0320 11:32:48.669457 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6\": container with ID starting with 730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6 not found: ID does not exist" containerID="730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.669529 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6"} err="failed to get container status \"730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6\": rpc error: code = NotFound desc = could not find container \"730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6\": container with ID 
starting with 730ef5d86677570ee74346501a6f4c74a19da2baa00a41c46623f9817b44e8b6 not found: ID does not exist" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.669573 4748 scope.go:117] "RemoveContainer" containerID="f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5" Mar 20 11:32:48 crc kubenswrapper[4748]: E0320 11:32:48.670259 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5\": container with ID starting with f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5 not found: ID does not exist" containerID="f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5" Mar 20 11:32:48 crc kubenswrapper[4748]: I0320 11:32:48.670298 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5"} err="failed to get container status \"f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5\": rpc error: code = NotFound desc = could not find container \"f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5\": container with ID starting with f5e752916f6277d1a104053eed805d847b4ffde1d886d935c4908c3983fcd5b5 not found: ID does not exist" Mar 20 11:32:49 crc kubenswrapper[4748]: I0320 11:32:49.525090 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" path="/var/lib/kubelet/pods/1eb5c60a-5677-4dc2-81fe-9fb6dc15947e/volumes" Mar 20 11:32:50 crc kubenswrapper[4748]: I0320 11:32:50.515662 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:32:50 crc kubenswrapper[4748]: E0320 11:32:50.516258 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:33:03 crc kubenswrapper[4748]: I0320 11:33:03.517244 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:33:03 crc kubenswrapper[4748]: E0320 11:33:03.517896 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:33:14 crc kubenswrapper[4748]: I0320 11:33:14.515737 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:33:14 crc kubenswrapper[4748]: E0320 11:33:14.516577 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:33:26 crc kubenswrapper[4748]: I0320 11:33:26.515822 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:33:26 crc kubenswrapper[4748]: E0320 11:33:26.516791 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:33:38 crc kubenswrapper[4748]: I0320 11:33:38.515728 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:33:38 crc kubenswrapper[4748]: E0320 11:33:38.516450 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.009253 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kq895"] Mar 20 11:33:39 crc kubenswrapper[4748]: E0320 11:33:39.010031 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerName="extract-utilities" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.010050 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerName="extract-utilities" Mar 20 11:33:39 crc kubenswrapper[4748]: E0320 11:33:39.010078 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerName="extract-content" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.010086 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerName="extract-content" Mar 20 11:33:39 crc kubenswrapper[4748]: E0320 11:33:39.010105 4748 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerName="registry-server" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.010113 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerName="registry-server" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.010347 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb5c60a-5677-4dc2-81fe-9fb6dc15947e" containerName="registry-server" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.012257 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.024516 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq895"] Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.120112 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-utilities\") pod \"redhat-marketplace-kq895\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.120431 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5qqq\" (UniqueName: \"kubernetes.io/projected/2d4f7438-390a-4412-927c-3c71340cf41e-kube-api-access-w5qqq\") pod \"redhat-marketplace-kq895\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.120518 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-catalog-content\") pod \"redhat-marketplace-kq895\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.221823 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5qqq\" (UniqueName: \"kubernetes.io/projected/2d4f7438-390a-4412-927c-3c71340cf41e-kube-api-access-w5qqq\") pod \"redhat-marketplace-kq895\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.221888 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-catalog-content\") pod \"redhat-marketplace-kq895\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.221986 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-utilities\") pod \"redhat-marketplace-kq895\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.222429 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-utilities\") pod \"redhat-marketplace-kq895\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.222725 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-catalog-content\") pod \"redhat-marketplace-kq895\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.260487 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5qqq\" (UniqueName: \"kubernetes.io/projected/2d4f7438-390a-4412-927c-3c71340cf41e-kube-api-access-w5qqq\") pod \"redhat-marketplace-kq895\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.348978 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.862408 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq895"] Mar 20 11:33:39 crc kubenswrapper[4748]: I0320 11:33:39.956889 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq895" event={"ID":"2d4f7438-390a-4412-927c-3c71340cf41e","Type":"ContainerStarted","Data":"a2f84eaae6c60e6761dd5817a34e5cf162e7504246badc8f486e5863e92b670e"} Mar 20 11:33:40 crc kubenswrapper[4748]: I0320 11:33:40.966667 4748 generic.go:334] "Generic (PLEG): container finished" podID="2d4f7438-390a-4412-927c-3c71340cf41e" containerID="fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f" exitCode=0 Mar 20 11:33:40 crc kubenswrapper[4748]: I0320 11:33:40.966751 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq895" event={"ID":"2d4f7438-390a-4412-927c-3c71340cf41e","Type":"ContainerDied","Data":"fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f"} Mar 20 11:33:40 crc kubenswrapper[4748]: I0320 11:33:40.969259 4748 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 20 11:33:42 crc kubenswrapper[4748]: I0320 11:33:42.987744 4748 generic.go:334] "Generic (PLEG): container finished" podID="2d4f7438-390a-4412-927c-3c71340cf41e" containerID="1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190" exitCode=0 Mar 20 11:33:42 crc kubenswrapper[4748]: I0320 11:33:42.987800 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq895" event={"ID":"2d4f7438-390a-4412-927c-3c71340cf41e","Type":"ContainerDied","Data":"1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190"} Mar 20 11:33:44 crc kubenswrapper[4748]: I0320 11:33:44.005453 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq895" event={"ID":"2d4f7438-390a-4412-927c-3c71340cf41e","Type":"ContainerStarted","Data":"a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2"} Mar 20 11:33:44 crc kubenswrapper[4748]: I0320 11:33:44.029235 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kq895" podStartSLOduration=3.320839843 podStartE2EDuration="6.029204904s" podCreationTimestamp="2026-03-20 11:33:38 +0000 UTC" firstStartedPulling="2026-03-20 11:33:40.969046173 +0000 UTC m=+3456.110591987" lastFinishedPulling="2026-03-20 11:33:43.677411234 +0000 UTC m=+3458.818957048" observedRunningTime="2026-03-20 11:33:44.024123835 +0000 UTC m=+3459.165669659" watchObservedRunningTime="2026-03-20 11:33:44.029204904 +0000 UTC m=+3459.170750718" Mar 20 11:33:49 crc kubenswrapper[4748]: I0320 11:33:49.349382 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:49 crc kubenswrapper[4748]: I0320 11:33:49.349866 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:49 crc kubenswrapper[4748]: 
I0320 11:33:49.394669 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:49 crc kubenswrapper[4748]: I0320 11:33:49.515241 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:33:49 crc kubenswrapper[4748]: E0320 11:33:49.515704 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:33:50 crc kubenswrapper[4748]: I0320 11:33:50.098864 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:50 crc kubenswrapper[4748]: I0320 11:33:50.152500 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq895"] Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.065026 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kq895" podUID="2d4f7438-390a-4412-927c-3c71340cf41e" containerName="registry-server" containerID="cri-o://a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2" gracePeriod=2 Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.664422 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.775167 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-catalog-content\") pod \"2d4f7438-390a-4412-927c-3c71340cf41e\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.775230 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-utilities\") pod \"2d4f7438-390a-4412-927c-3c71340cf41e\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.775287 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5qqq\" (UniqueName: \"kubernetes.io/projected/2d4f7438-390a-4412-927c-3c71340cf41e-kube-api-access-w5qqq\") pod \"2d4f7438-390a-4412-927c-3c71340cf41e\" (UID: \"2d4f7438-390a-4412-927c-3c71340cf41e\") " Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.776228 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-utilities" (OuterVolumeSpecName: "utilities") pod "2d4f7438-390a-4412-927c-3c71340cf41e" (UID: "2d4f7438-390a-4412-927c-3c71340cf41e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.781056 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d4f7438-390a-4412-927c-3c71340cf41e-kube-api-access-w5qqq" (OuterVolumeSpecName: "kube-api-access-w5qqq") pod "2d4f7438-390a-4412-927c-3c71340cf41e" (UID: "2d4f7438-390a-4412-927c-3c71340cf41e"). InnerVolumeSpecName "kube-api-access-w5qqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.804395 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d4f7438-390a-4412-927c-3c71340cf41e" (UID: "2d4f7438-390a-4412-927c-3c71340cf41e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.877894 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.878149 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d4f7438-390a-4412-927c-3c71340cf41e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:52 crc kubenswrapper[4748]: I0320 11:33:52.878220 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5qqq\" (UniqueName: \"kubernetes.io/projected/2d4f7438-390a-4412-927c-3c71340cf41e-kube-api-access-w5qqq\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.075599 4748 generic.go:334] "Generic (PLEG): container finished" podID="2d4f7438-390a-4412-927c-3c71340cf41e" containerID="a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2" exitCode=0 Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.075643 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kq895" event={"ID":"2d4f7438-390a-4412-927c-3c71340cf41e","Type":"ContainerDied","Data":"a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2"} Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.075670 4748 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-kq895" event={"ID":"2d4f7438-390a-4412-927c-3c71340cf41e","Type":"ContainerDied","Data":"a2f84eaae6c60e6761dd5817a34e5cf162e7504246badc8f486e5863e92b670e"} Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.075687 4748 scope.go:117] "RemoveContainer" containerID="a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2" Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.075819 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kq895" Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.114005 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq895"] Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.118136 4748 scope.go:117] "RemoveContainer" containerID="1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190" Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.124581 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kq895"] Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.146740 4748 scope.go:117] "RemoveContainer" containerID="fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f" Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.182607 4748 scope.go:117] "RemoveContainer" containerID="a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2" Mar 20 11:33:53 crc kubenswrapper[4748]: E0320 11:33:53.183593 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2\": container with ID starting with a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2 not found: ID does not exist" containerID="a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2" Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.183642 4748 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2"} err="failed to get container status \"a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2\": rpc error: code = NotFound desc = could not find container \"a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2\": container with ID starting with a05c92df8e0964fdf044e350c0a70fd797bebddcb312513606f31ae012605dc2 not found: ID does not exist" Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.183675 4748 scope.go:117] "RemoveContainer" containerID="1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190" Mar 20 11:33:53 crc kubenswrapper[4748]: E0320 11:33:53.183996 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190\": container with ID starting with 1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190 not found: ID does not exist" containerID="1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190" Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.184053 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190"} err="failed to get container status \"1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190\": rpc error: code = NotFound desc = could not find container \"1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190\": container with ID starting with 1756c5ccaf512d06774af2fc9f357946556d7c7fb40624f84903dc1c79aa5190 not found: ID does not exist" Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.184095 4748 scope.go:117] "RemoveContainer" containerID="fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f" Mar 20 11:33:53 crc kubenswrapper[4748]: E0320 
11:33:53.184519 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f\": container with ID starting with fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f not found: ID does not exist" containerID="fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f"
Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.184711 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f"} err="failed to get container status \"fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f\": rpc error: code = NotFound desc = could not find container \"fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f\": container with ID starting with fbae17cc91c7d41de391c71ce71ebef25429d7749f506ab8135e2029e203693f not found: ID does not exist"
Mar 20 11:33:53 crc kubenswrapper[4748]: I0320 11:33:53.527059 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d4f7438-390a-4412-927c-3c71340cf41e" path="/var/lib/kubelet/pods/2d4f7438-390a-4412-927c-3c71340cf41e/volumes"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.154729 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566774-gpvgr"]
Mar 20 11:34:00 crc kubenswrapper[4748]: E0320 11:34:00.156079 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d4f7438-390a-4412-927c-3c71340cf41e" containerName="extract-utilities"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.156099 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d4f7438-390a-4412-927c-3c71340cf41e" containerName="extract-utilities"
Mar 20 11:34:00 crc kubenswrapper[4748]: E0320 11:34:00.156133 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d4f7438-390a-4412-927c-3c71340cf41e" containerName="registry-server"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.156143 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d4f7438-390a-4412-927c-3c71340cf41e" containerName="registry-server"
Mar 20 11:34:00 crc kubenswrapper[4748]: E0320 11:34:00.156164 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d4f7438-390a-4412-927c-3c71340cf41e" containerName="extract-content"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.156173 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d4f7438-390a-4412-927c-3c71340cf41e" containerName="extract-content"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.156390 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d4f7438-390a-4412-927c-3c71340cf41e" containerName="registry-server"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.157144 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-gpvgr"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.159346 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.160891 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.162689 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.167872 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-gpvgr"]
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.214189 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfb5\" (UniqueName: \"kubernetes.io/projected/d5ce9df7-7771-4990-aae5-710d568b5c1e-kube-api-access-nwfb5\") pod \"auto-csr-approver-29566774-gpvgr\" (UID: \"d5ce9df7-7771-4990-aae5-710d568b5c1e\") " pod="openshift-infra/auto-csr-approver-29566774-gpvgr"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.316709 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfb5\" (UniqueName: \"kubernetes.io/projected/d5ce9df7-7771-4990-aae5-710d568b5c1e-kube-api-access-nwfb5\") pod \"auto-csr-approver-29566774-gpvgr\" (UID: \"d5ce9df7-7771-4990-aae5-710d568b5c1e\") " pod="openshift-infra/auto-csr-approver-29566774-gpvgr"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.338369 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfb5\" (UniqueName: \"kubernetes.io/projected/d5ce9df7-7771-4990-aae5-710d568b5c1e-kube-api-access-nwfb5\") pod \"auto-csr-approver-29566774-gpvgr\" (UID: \"d5ce9df7-7771-4990-aae5-710d568b5c1e\") " pod="openshift-infra/auto-csr-approver-29566774-gpvgr"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.476830 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-gpvgr"
Mar 20 11:34:00 crc kubenswrapper[4748]: I0320 11:34:00.925163 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-gpvgr"]
Mar 20 11:34:01 crc kubenswrapper[4748]: I0320 11:34:01.145545 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-gpvgr" event={"ID":"d5ce9df7-7771-4990-aae5-710d568b5c1e","Type":"ContainerStarted","Data":"6b673b8c0f46218ad429f0968bbf5bee6a70d8052cf4deba08573509c29206c7"}
Mar 20 11:34:03 crc kubenswrapper[4748]: I0320 11:34:03.163675 4748 generic.go:334] "Generic (PLEG): container finished" podID="d5ce9df7-7771-4990-aae5-710d568b5c1e" containerID="25733099c3b84a74f0ccadf3cfbbb9ca78894a58965e11e66a18a9ce47b60f65" exitCode=0
Mar 20 11:34:03 crc kubenswrapper[4748]: I0320 11:34:03.163737 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-gpvgr" event={"ID":"d5ce9df7-7771-4990-aae5-710d568b5c1e","Type":"ContainerDied","Data":"25733099c3b84a74f0ccadf3cfbbb9ca78894a58965e11e66a18a9ce47b60f65"}
Mar 20 11:34:03 crc kubenswrapper[4748]: I0320 11:34:03.516117 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334"
Mar 20 11:34:03 crc kubenswrapper[4748]: E0320 11:34:03.516648 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"
Mar 20 11:34:04 crc kubenswrapper[4748]: I0320 11:34:04.683382 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-gpvgr"
Mar 20 11:34:04 crc kubenswrapper[4748]: I0320 11:34:04.809457 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfb5\" (UniqueName: \"kubernetes.io/projected/d5ce9df7-7771-4990-aae5-710d568b5c1e-kube-api-access-nwfb5\") pod \"d5ce9df7-7771-4990-aae5-710d568b5c1e\" (UID: \"d5ce9df7-7771-4990-aae5-710d568b5c1e\") "
Mar 20 11:34:04 crc kubenswrapper[4748]: I0320 11:34:04.815537 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ce9df7-7771-4990-aae5-710d568b5c1e-kube-api-access-nwfb5" (OuterVolumeSpecName: "kube-api-access-nwfb5") pod "d5ce9df7-7771-4990-aae5-710d568b5c1e" (UID: "d5ce9df7-7771-4990-aae5-710d568b5c1e"). InnerVolumeSpecName "kube-api-access-nwfb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:34:04 crc kubenswrapper[4748]: I0320 11:34:04.912136 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfb5\" (UniqueName: \"kubernetes.io/projected/d5ce9df7-7771-4990-aae5-710d568b5c1e-kube-api-access-nwfb5\") on node \"crc\" DevicePath \"\""
Mar 20 11:34:05 crc kubenswrapper[4748]: I0320 11:34:05.179155 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-gpvgr" event={"ID":"d5ce9df7-7771-4990-aae5-710d568b5c1e","Type":"ContainerDied","Data":"6b673b8c0f46218ad429f0968bbf5bee6a70d8052cf4deba08573509c29206c7"}
Mar 20 11:34:05 crc kubenswrapper[4748]: I0320 11:34:05.179196 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b673b8c0f46218ad429f0968bbf5bee6a70d8052cf4deba08573509c29206c7"
Mar 20 11:34:05 crc kubenswrapper[4748]: I0320 11:34:05.179229 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-gpvgr"
Mar 20 11:34:05 crc kubenswrapper[4748]: I0320 11:34:05.748931 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-wxhzm"]
Mar 20 11:34:05 crc kubenswrapper[4748]: I0320 11:34:05.758343 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-wxhzm"]
Mar 20 11:34:07 crc kubenswrapper[4748]: I0320 11:34:07.527967 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60133479-2ce5-441a-a3d7-0e05c4f5ac67" path="/var/lib/kubelet/pods/60133479-2ce5-441a-a3d7-0e05c4f5ac67/volumes"
Mar 20 11:34:15 crc kubenswrapper[4748]: I0320 11:34:15.522420 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334"
Mar 20 11:34:15 crc kubenswrapper[4748]: E0320 11:34:15.523479 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"
Mar 20 11:34:21 crc kubenswrapper[4748]: I0320 11:34:21.945626 4748 scope.go:117] "RemoveContainer" containerID="73434fabb3c42ae469c6a5dd03641b07e2abd41abc4c49ab70a58552dbff6c9c"
Mar 20 11:34:29 crc kubenswrapper[4748]: I0320 11:34:29.518699 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334"
Mar 20 11:34:29 crc kubenswrapper[4748]: E0320 11:34:29.519637 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"
Mar 20 11:34:44 crc kubenswrapper[4748]: I0320 11:34:44.515432 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334"
Mar 20 11:34:45 crc kubenswrapper[4748]: I0320 11:34:45.660872 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"497a1fd1f41e7ba95a2cee4f115e9a20d225eab5bd9fa7ed076e957960cae82e"}
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.144941 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566776-rhmwq"]
Mar 20 11:36:00 crc kubenswrapper[4748]: E0320 11:36:00.145929 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ce9df7-7771-4990-aae5-710d568b5c1e" containerName="oc"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.145943 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ce9df7-7771-4990-aae5-710d568b5c1e" containerName="oc"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.146136 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ce9df7-7771-4990-aae5-710d568b5c1e" containerName="oc"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.146860 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-rhmwq"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.148728 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.149520 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.153224 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.156237 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-rhmwq"]
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.225328 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvqkm\" (UniqueName: \"kubernetes.io/projected/12887856-ffb1-4e0a-924c-f53d52d94235-kube-api-access-vvqkm\") pod \"auto-csr-approver-29566776-rhmwq\" (UID: \"12887856-ffb1-4e0a-924c-f53d52d94235\") " pod="openshift-infra/auto-csr-approver-29566776-rhmwq"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.326845 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvqkm\" (UniqueName: \"kubernetes.io/projected/12887856-ffb1-4e0a-924c-f53d52d94235-kube-api-access-vvqkm\") pod \"auto-csr-approver-29566776-rhmwq\" (UID: \"12887856-ffb1-4e0a-924c-f53d52d94235\") " pod="openshift-infra/auto-csr-approver-29566776-rhmwq"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.349368 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvqkm\" (UniqueName: \"kubernetes.io/projected/12887856-ffb1-4e0a-924c-f53d52d94235-kube-api-access-vvqkm\") pod \"auto-csr-approver-29566776-rhmwq\" (UID: \"12887856-ffb1-4e0a-924c-f53d52d94235\") " pod="openshift-infra/auto-csr-approver-29566776-rhmwq"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.479182 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-rhmwq"
Mar 20 11:36:00 crc kubenswrapper[4748]: I0320 11:36:00.915893 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-rhmwq"]
Mar 20 11:36:01 crc kubenswrapper[4748]: I0320 11:36:01.363419 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-rhmwq" event={"ID":"12887856-ffb1-4e0a-924c-f53d52d94235","Type":"ContainerStarted","Data":"44be057610b26af6788494d475e597e9d6168d2183f00101750eda4a7002fcca"}
Mar 20 11:36:03 crc kubenswrapper[4748]: I0320 11:36:03.382585 4748 generic.go:334] "Generic (PLEG): container finished" podID="12887856-ffb1-4e0a-924c-f53d52d94235" containerID="6236884d31a49d4756f2ff3cf7fee61b1fb1eaa800f6f9a1b1772761de81c87e" exitCode=0
Mar 20 11:36:03 crc kubenswrapper[4748]: I0320 11:36:03.382634 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-rhmwq" event={"ID":"12887856-ffb1-4e0a-924c-f53d52d94235","Type":"ContainerDied","Data":"6236884d31a49d4756f2ff3cf7fee61b1fb1eaa800f6f9a1b1772761de81c87e"}
Mar 20 11:36:04 crc kubenswrapper[4748]: I0320 11:36:04.918179 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-rhmwq"
Mar 20 11:36:05 crc kubenswrapper[4748]: I0320 11:36:05.033694 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvqkm\" (UniqueName: \"kubernetes.io/projected/12887856-ffb1-4e0a-924c-f53d52d94235-kube-api-access-vvqkm\") pod \"12887856-ffb1-4e0a-924c-f53d52d94235\" (UID: \"12887856-ffb1-4e0a-924c-f53d52d94235\") "
Mar 20 11:36:05 crc kubenswrapper[4748]: I0320 11:36:05.044865 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12887856-ffb1-4e0a-924c-f53d52d94235-kube-api-access-vvqkm" (OuterVolumeSpecName: "kube-api-access-vvqkm") pod "12887856-ffb1-4e0a-924c-f53d52d94235" (UID: "12887856-ffb1-4e0a-924c-f53d52d94235"). InnerVolumeSpecName "kube-api-access-vvqkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:36:05 crc kubenswrapper[4748]: I0320 11:36:05.138326 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvqkm\" (UniqueName: \"kubernetes.io/projected/12887856-ffb1-4e0a-924c-f53d52d94235-kube-api-access-vvqkm\") on node \"crc\" DevicePath \"\""
Mar 20 11:36:05 crc kubenswrapper[4748]: I0320 11:36:05.401238 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-rhmwq" event={"ID":"12887856-ffb1-4e0a-924c-f53d52d94235","Type":"ContainerDied","Data":"44be057610b26af6788494d475e597e9d6168d2183f00101750eda4a7002fcca"}
Mar 20 11:36:05 crc kubenswrapper[4748]: I0320 11:36:05.401546 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44be057610b26af6788494d475e597e9d6168d2183f00101750eda4a7002fcca"
Mar 20 11:36:05 crc kubenswrapper[4748]: I0320 11:36:05.401292 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-rhmwq"
Mar 20 11:36:06 crc kubenswrapper[4748]: I0320 11:36:06.034764 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-gvk8w"]
Mar 20 11:36:06 crc kubenswrapper[4748]: I0320 11:36:06.064054 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-gvk8w"]
Mar 20 11:36:07 crc kubenswrapper[4748]: I0320 11:36:07.527987 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d96e85b-256f-4c95-bcae-d38395ee0820" path="/var/lib/kubelet/pods/8d96e85b-256f-4c95-bcae-d38395ee0820/volumes"
Mar 20 11:36:22 crc kubenswrapper[4748]: I0320 11:36:22.083115 4748 scope.go:117] "RemoveContainer" containerID="a09888fd6fab3edbd3166869eecff9abba2fc54929fbf2edc5615ba03cf6ad3a"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.251859 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f7lpq"]
Mar 20 11:36:33 crc kubenswrapper[4748]: E0320 11:36:33.252887 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12887856-ffb1-4e0a-924c-f53d52d94235" containerName="oc"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.252900 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="12887856-ffb1-4e0a-924c-f53d52d94235" containerName="oc"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.253105 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="12887856-ffb1-4e0a-924c-f53d52d94235" containerName="oc"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.254902 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.278704 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7lpq"]
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.397958 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-utilities\") pod \"community-operators-f7lpq\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") " pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.398155 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvxbv\" (UniqueName: \"kubernetes.io/projected/3e2df52c-1a2b-4d5f-9080-4cac0426d416-kube-api-access-mvxbv\") pod \"community-operators-f7lpq\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") " pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.398306 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-catalog-content\") pod \"community-operators-f7lpq\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") " pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.500205 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-catalog-content\") pod \"community-operators-f7lpq\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") " pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.500282 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-utilities\") pod \"community-operators-f7lpq\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") " pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.500406 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvxbv\" (UniqueName: \"kubernetes.io/projected/3e2df52c-1a2b-4d5f-9080-4cac0426d416-kube-api-access-mvxbv\") pod \"community-operators-f7lpq\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") " pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.500776 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-catalog-content\") pod \"community-operators-f7lpq\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") " pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.500816 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-utilities\") pod \"community-operators-f7lpq\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") " pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.520977 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvxbv\" (UniqueName: \"kubernetes.io/projected/3e2df52c-1a2b-4d5f-9080-4cac0426d416-kube-api-access-mvxbv\") pod \"community-operators-f7lpq\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") " pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:33 crc kubenswrapper[4748]: I0320 11:36:33.584717 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:34 crc kubenswrapper[4748]: I0320 11:36:34.134364 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7lpq"]
Mar 20 11:36:34 crc kubenswrapper[4748]: I0320 11:36:34.671657 4748 generic.go:334] "Generic (PLEG): container finished" podID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerID="dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1" exitCode=0
Mar 20 11:36:34 crc kubenswrapper[4748]: I0320 11:36:34.671702 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7lpq" event={"ID":"3e2df52c-1a2b-4d5f-9080-4cac0426d416","Type":"ContainerDied","Data":"dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1"}
Mar 20 11:36:34 crc kubenswrapper[4748]: I0320 11:36:34.671729 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7lpq" event={"ID":"3e2df52c-1a2b-4d5f-9080-4cac0426d416","Type":"ContainerStarted","Data":"1a52cc15624a42f7630b1a1a998242b2e8b7dd34427b085969ca24938ecd2159"}
Mar 20 11:36:35 crc kubenswrapper[4748]: I0320 11:36:35.681219 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7lpq" event={"ID":"3e2df52c-1a2b-4d5f-9080-4cac0426d416","Type":"ContainerStarted","Data":"7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b"}
Mar 20 11:36:36 crc kubenswrapper[4748]: I0320 11:36:36.691447 4748 generic.go:334] "Generic (PLEG): container finished" podID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerID="7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b" exitCode=0
Mar 20 11:36:36 crc kubenswrapper[4748]: I0320 11:36:36.691537 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7lpq" event={"ID":"3e2df52c-1a2b-4d5f-9080-4cac0426d416","Type":"ContainerDied","Data":"7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b"}
Mar 20 11:36:37 crc kubenswrapper[4748]: I0320 11:36:37.709647 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7lpq" event={"ID":"3e2df52c-1a2b-4d5f-9080-4cac0426d416","Type":"ContainerStarted","Data":"c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d"}
Mar 20 11:36:37 crc kubenswrapper[4748]: I0320 11:36:37.734372 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f7lpq" podStartSLOduration=2.09198596 podStartE2EDuration="4.734351878s" podCreationTimestamp="2026-03-20 11:36:33 +0000 UTC" firstStartedPulling="2026-03-20 11:36:34.673279251 +0000 UTC m=+3629.814825075" lastFinishedPulling="2026-03-20 11:36:37.315645179 +0000 UTC m=+3632.457190993" observedRunningTime="2026-03-20 11:36:37.731288529 +0000 UTC m=+3632.872834343" watchObservedRunningTime="2026-03-20 11:36:37.734351878 +0000 UTC m=+3632.875897702"
Mar 20 11:36:43 crc kubenswrapper[4748]: I0320 11:36:43.584965 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:43 crc kubenswrapper[4748]: I0320 11:36:43.585598 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:43 crc kubenswrapper[4748]: I0320 11:36:43.639591 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:43 crc kubenswrapper[4748]: I0320 11:36:43.830608 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:43 crc kubenswrapper[4748]: I0320 11:36:43.900150 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7lpq"]
Mar 20 11:36:45 crc kubenswrapper[4748]: I0320 11:36:45.798152 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f7lpq" podUID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerName="registry-server" containerID="cri-o://c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d" gracePeriod=2
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.512154 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.650926 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvxbv\" (UniqueName: \"kubernetes.io/projected/3e2df52c-1a2b-4d5f-9080-4cac0426d416-kube-api-access-mvxbv\") pod \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") "
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.651290 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-catalog-content\") pod \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") "
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.651396 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-utilities\") pod \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\" (UID: \"3e2df52c-1a2b-4d5f-9080-4cac0426d416\") "
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.652275 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-utilities" (OuterVolumeSpecName: "utilities") pod "3e2df52c-1a2b-4d5f-9080-4cac0426d416" (UID: "3e2df52c-1a2b-4d5f-9080-4cac0426d416"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.657146 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2df52c-1a2b-4d5f-9080-4cac0426d416-kube-api-access-mvxbv" (OuterVolumeSpecName: "kube-api-access-mvxbv") pod "3e2df52c-1a2b-4d5f-9080-4cac0426d416" (UID: "3e2df52c-1a2b-4d5f-9080-4cac0426d416"). InnerVolumeSpecName "kube-api-access-mvxbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.753490 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.753525 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvxbv\" (UniqueName: \"kubernetes.io/projected/3e2df52c-1a2b-4d5f-9080-4cac0426d416-kube-api-access-mvxbv\") on node \"crc\" DevicePath \"\""
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.808396 4748 generic.go:334] "Generic (PLEG): container finished" podID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerID="c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d" exitCode=0
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.808437 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7lpq" event={"ID":"3e2df52c-1a2b-4d5f-9080-4cac0426d416","Type":"ContainerDied","Data":"c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d"}
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.808456 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7lpq"
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.808474 4748 scope.go:117] "RemoveContainer" containerID="c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d"
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.808464 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7lpq" event={"ID":"3e2df52c-1a2b-4d5f-9080-4cac0426d416","Type":"ContainerDied","Data":"1a52cc15624a42f7630b1a1a998242b2e8b7dd34427b085969ca24938ecd2159"}
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.832605 4748 scope.go:117] "RemoveContainer" containerID="7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b"
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.863920 4748 scope.go:117] "RemoveContainer" containerID="dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1"
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.920133 4748 scope.go:117] "RemoveContainer" containerID="c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d"
Mar 20 11:36:46 crc kubenswrapper[4748]: E0320 11:36:46.920643 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d\": container with ID starting with c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d not found: ID does not exist" containerID="c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d"
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.920686 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d"} err="failed to get container status \"c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d\": rpc error: code = NotFound desc = could not find container \"c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d\": container with ID starting with c116a8032646c2be656ae5792f887bc687fc64b0d4a65bd2383d3d3a8986191d not found: ID does not exist"
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.920714 4748 scope.go:117] "RemoveContainer" containerID="7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b"
Mar 20 11:36:46 crc kubenswrapper[4748]: E0320 11:36:46.921254 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b\": container with ID starting with 7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b not found: ID does not exist" containerID="7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b"
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.921429 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b"} err="failed to get container status \"7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b\": rpc error: code = NotFound desc = could not find container \"7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b\": container with ID starting with 7f8eba70ed1eaa8e6bf0475e75b53c2196dc820391dfb2162f0676254a3a777b not found: ID does not exist"
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.921478 4748 scope.go:117] "RemoveContainer" containerID="dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1"
Mar 20 11:36:46 crc kubenswrapper[4748]: E0320 11:36:46.921813 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1\": container with ID starting with dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1 not found: ID does not exist" containerID="dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1"
Mar 20 11:36:46 crc kubenswrapper[4748]: I0320 11:36:46.921863 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1"} err="failed to get container status \"dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1\": rpc error: code = NotFound desc = could not find container \"dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1\": container with ID starting with dab40e7124fddab9f488b1a2858820c95d4ffcfa5389b1e94f34c0ab876e34f1 not found: ID does not exist"
Mar 20 11:36:49 crc kubenswrapper[4748]: I0320 11:36:49.557279 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e2df52c-1a2b-4d5f-9080-4cac0426d416" (UID: "3e2df52c-1a2b-4d5f-9080-4cac0426d416"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:36:49 crc kubenswrapper[4748]: I0320 11:36:49.608068 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e2df52c-1a2b-4d5f-9080-4cac0426d416-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:36:49 crc kubenswrapper[4748]: I0320 11:36:49.851982 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7lpq"]
Mar 20 11:36:49 crc kubenswrapper[4748]: I0320 11:36:49.859471 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f7lpq"]
Mar 20 11:36:51 crc kubenswrapper[4748]: I0320 11:36:51.525409 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" path="/var/lib/kubelet/pods/3e2df52c-1a2b-4d5f-9080-4cac0426d416/volumes"
Mar 20 11:37:12 crc kubenswrapper[4748]: I0320 11:37:12.928515 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:37:12 crc kubenswrapper[4748]: I0320 11:37:12.929865 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:37:42 crc kubenswrapper[4748]: I0320 11:37:42.928134 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection
refused" start-of-body= Mar 20 11:37:42 crc kubenswrapper[4748]: I0320 11:37:42.928812 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.156176 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dscqn"] Mar 20 11:38:00 crc kubenswrapper[4748]: E0320 11:38:00.157165 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerName="extract-content" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.157184 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerName="extract-content" Mar 20 11:38:00 crc kubenswrapper[4748]: E0320 11:38:00.157203 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerName="extract-utilities" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.157211 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerName="extract-utilities" Mar 20 11:38:00 crc kubenswrapper[4748]: E0320 11:38:00.157224 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerName="registry-server" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.157229 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerName="registry-server" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.157490 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2df52c-1a2b-4d5f-9080-4cac0426d416" containerName="registry-server" Mar 20 
11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.158254 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dscqn" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.161207 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.161497 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.161504 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.165088 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dscqn"] Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.199658 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2252d\" (UniqueName: \"kubernetes.io/projected/d3961923-315d-4a52-98d7-98a8a9c3a9fc-kube-api-access-2252d\") pod \"auto-csr-approver-29566778-dscqn\" (UID: \"d3961923-315d-4a52-98d7-98a8a9c3a9fc\") " pod="openshift-infra/auto-csr-approver-29566778-dscqn" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.301094 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2252d\" (UniqueName: \"kubernetes.io/projected/d3961923-315d-4a52-98d7-98a8a9c3a9fc-kube-api-access-2252d\") pod \"auto-csr-approver-29566778-dscqn\" (UID: \"d3961923-315d-4a52-98d7-98a8a9c3a9fc\") " pod="openshift-infra/auto-csr-approver-29566778-dscqn" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.320983 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2252d\" (UniqueName: 
\"kubernetes.io/projected/d3961923-315d-4a52-98d7-98a8a9c3a9fc-kube-api-access-2252d\") pod \"auto-csr-approver-29566778-dscqn\" (UID: \"d3961923-315d-4a52-98d7-98a8a9c3a9fc\") " pod="openshift-infra/auto-csr-approver-29566778-dscqn" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.492494 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dscqn" Mar 20 11:38:00 crc kubenswrapper[4748]: I0320 11:38:00.948625 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dscqn"] Mar 20 11:38:01 crc kubenswrapper[4748]: I0320 11:38:01.430811 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-dscqn" event={"ID":"d3961923-315d-4a52-98d7-98a8a9c3a9fc","Type":"ContainerStarted","Data":"051e29683b44cc1556f2c150381cebebab1d5dcc65a5477d132e6f4ade848141"} Mar 20 11:38:07 crc kubenswrapper[4748]: I0320 11:38:07.478028 4748 generic.go:334] "Generic (PLEG): container finished" podID="d3961923-315d-4a52-98d7-98a8a9c3a9fc" containerID="db990d66cf4ee1f553d87f88d3f6ca6b0c6dc157bd0b5eb9a2ac783626d34078" exitCode=0 Mar 20 11:38:07 crc kubenswrapper[4748]: I0320 11:38:07.478117 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-dscqn" event={"ID":"d3961923-315d-4a52-98d7-98a8a9c3a9fc","Type":"ContainerDied","Data":"db990d66cf4ee1f553d87f88d3f6ca6b0c6dc157bd0b5eb9a2ac783626d34078"} Mar 20 11:38:08 crc kubenswrapper[4748]: I0320 11:38:08.855952 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dscqn" Mar 20 11:38:08 crc kubenswrapper[4748]: I0320 11:38:08.970113 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2252d\" (UniqueName: \"kubernetes.io/projected/d3961923-315d-4a52-98d7-98a8a9c3a9fc-kube-api-access-2252d\") pod \"d3961923-315d-4a52-98d7-98a8a9c3a9fc\" (UID: \"d3961923-315d-4a52-98d7-98a8a9c3a9fc\") " Mar 20 11:38:08 crc kubenswrapper[4748]: I0320 11:38:08.977746 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3961923-315d-4a52-98d7-98a8a9c3a9fc-kube-api-access-2252d" (OuterVolumeSpecName: "kube-api-access-2252d") pod "d3961923-315d-4a52-98d7-98a8a9c3a9fc" (UID: "d3961923-315d-4a52-98d7-98a8a9c3a9fc"). InnerVolumeSpecName "kube-api-access-2252d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:38:09 crc kubenswrapper[4748]: I0320 11:38:09.073226 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2252d\" (UniqueName: \"kubernetes.io/projected/d3961923-315d-4a52-98d7-98a8a9c3a9fc-kube-api-access-2252d\") on node \"crc\" DevicePath \"\"" Mar 20 11:38:09 crc kubenswrapper[4748]: I0320 11:38:09.494638 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-dscqn" event={"ID":"d3961923-315d-4a52-98d7-98a8a9c3a9fc","Type":"ContainerDied","Data":"051e29683b44cc1556f2c150381cebebab1d5dcc65a5477d132e6f4ade848141"} Mar 20 11:38:09 crc kubenswrapper[4748]: I0320 11:38:09.494677 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="051e29683b44cc1556f2c150381cebebab1d5dcc65a5477d132e6f4ade848141" Mar 20 11:38:09 crc kubenswrapper[4748]: I0320 11:38:09.494716 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dscqn" Mar 20 11:38:09 crc kubenswrapper[4748]: I0320 11:38:09.931753 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-j4rsj"] Mar 20 11:38:09 crc kubenswrapper[4748]: I0320 11:38:09.943590 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-j4rsj"] Mar 20 11:38:11 crc kubenswrapper[4748]: I0320 11:38:11.526562 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258d3ead-d0c6-4a5a-8687-69b4385f5eb4" path="/var/lib/kubelet/pods/258d3ead-d0c6-4a5a-8687-69b4385f5eb4/volumes" Mar 20 11:38:12 crc kubenswrapper[4748]: I0320 11:38:12.928496 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:38:12 crc kubenswrapper[4748]: I0320 11:38:12.928561 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:38:12 crc kubenswrapper[4748]: I0320 11:38:12.928608 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 11:38:12 crc kubenswrapper[4748]: I0320 11:38:12.929435 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"497a1fd1f41e7ba95a2cee4f115e9a20d225eab5bd9fa7ed076e957960cae82e"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:38:12 crc kubenswrapper[4748]: I0320 11:38:12.929502 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://497a1fd1f41e7ba95a2cee4f115e9a20d225eab5bd9fa7ed076e957960cae82e" gracePeriod=600 Mar 20 11:38:13 crc kubenswrapper[4748]: I0320 11:38:13.530309 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="497a1fd1f41e7ba95a2cee4f115e9a20d225eab5bd9fa7ed076e957960cae82e" exitCode=0 Mar 20 11:38:13 crc kubenswrapper[4748]: I0320 11:38:13.530360 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"497a1fd1f41e7ba95a2cee4f115e9a20d225eab5bd9fa7ed076e957960cae82e"} Mar 20 11:38:13 crc kubenswrapper[4748]: I0320 11:38:13.530944 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933"} Mar 20 11:38:13 crc kubenswrapper[4748]: I0320 11:38:13.530969 4748 scope.go:117] "RemoveContainer" containerID="9c9ee957322fa34d688fcab2aab1e68d05f6773bc13436058fe28e475976d334" Mar 20 11:38:22 crc kubenswrapper[4748]: I0320 11:38:22.205051 4748 scope.go:117] "RemoveContainer" containerID="81f36b4a6b7e34a56376654022dfc979842f5e530672e4498fdf8e821f03d05c" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.147602 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566780-wm8dm"] Mar 20 11:40:00 crc kubenswrapper[4748]: E0320 
11:40:00.148622 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3961923-315d-4a52-98d7-98a8a9c3a9fc" containerName="oc" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.148639 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3961923-315d-4a52-98d7-98a8a9c3a9fc" containerName="oc" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.148927 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3961923-315d-4a52-98d7-98a8a9c3a9fc" containerName="oc" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.149690 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-wm8dm" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.155548 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.155803 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.155983 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.175052 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-wm8dm"] Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.264269 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jlt\" (UniqueName: \"kubernetes.io/projected/4f104ce4-9d59-4c4b-97d8-8b186f48bc00-kube-api-access-88jlt\") pod \"auto-csr-approver-29566780-wm8dm\" (UID: \"4f104ce4-9d59-4c4b-97d8-8b186f48bc00\") " pod="openshift-infra/auto-csr-approver-29566780-wm8dm" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.366191 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-88jlt\" (UniqueName: \"kubernetes.io/projected/4f104ce4-9d59-4c4b-97d8-8b186f48bc00-kube-api-access-88jlt\") pod \"auto-csr-approver-29566780-wm8dm\" (UID: \"4f104ce4-9d59-4c4b-97d8-8b186f48bc00\") " pod="openshift-infra/auto-csr-approver-29566780-wm8dm" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.391220 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jlt\" (UniqueName: \"kubernetes.io/projected/4f104ce4-9d59-4c4b-97d8-8b186f48bc00-kube-api-access-88jlt\") pod \"auto-csr-approver-29566780-wm8dm\" (UID: \"4f104ce4-9d59-4c4b-97d8-8b186f48bc00\") " pod="openshift-infra/auto-csr-approver-29566780-wm8dm" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.478538 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-wm8dm" Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.937627 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-wm8dm"] Mar 20 11:40:00 crc kubenswrapper[4748]: I0320 11:40:00.944111 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:40:01 crc kubenswrapper[4748]: I0320 11:40:01.950255 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-wm8dm" event={"ID":"4f104ce4-9d59-4c4b-97d8-8b186f48bc00","Type":"ContainerStarted","Data":"ca172027b594ec2a19025713ecebb25383cdca50a7e4fa091bc0c960a84f0ff5"} Mar 20 11:40:02 crc kubenswrapper[4748]: I0320 11:40:02.959826 4748 generic.go:334] "Generic (PLEG): container finished" podID="4f104ce4-9d59-4c4b-97d8-8b186f48bc00" containerID="3be38a3d2958121772ea0fc9425d0d4401d392ed7ceaecb4d1ff03be9104ac74" exitCode=0 Mar 20 11:40:02 crc kubenswrapper[4748]: I0320 11:40:02.959907 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-wm8dm" 
event={"ID":"4f104ce4-9d59-4c4b-97d8-8b186f48bc00","Type":"ContainerDied","Data":"3be38a3d2958121772ea0fc9425d0d4401d392ed7ceaecb4d1ff03be9104ac74"} Mar 20 11:40:04 crc kubenswrapper[4748]: I0320 11:40:04.521018 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-wm8dm" Mar 20 11:40:04 crc kubenswrapper[4748]: I0320 11:40:04.566657 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88jlt\" (UniqueName: \"kubernetes.io/projected/4f104ce4-9d59-4c4b-97d8-8b186f48bc00-kube-api-access-88jlt\") pod \"4f104ce4-9d59-4c4b-97d8-8b186f48bc00\" (UID: \"4f104ce4-9d59-4c4b-97d8-8b186f48bc00\") " Mar 20 11:40:04 crc kubenswrapper[4748]: I0320 11:40:04.573392 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f104ce4-9d59-4c4b-97d8-8b186f48bc00-kube-api-access-88jlt" (OuterVolumeSpecName: "kube-api-access-88jlt") pod "4f104ce4-9d59-4c4b-97d8-8b186f48bc00" (UID: "4f104ce4-9d59-4c4b-97d8-8b186f48bc00"). InnerVolumeSpecName "kube-api-access-88jlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:40:04 crc kubenswrapper[4748]: I0320 11:40:04.669893 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88jlt\" (UniqueName: \"kubernetes.io/projected/4f104ce4-9d59-4c4b-97d8-8b186f48bc00-kube-api-access-88jlt\") on node \"crc\" DevicePath \"\"" Mar 20 11:40:04 crc kubenswrapper[4748]: I0320 11:40:04.980122 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-wm8dm" event={"ID":"4f104ce4-9d59-4c4b-97d8-8b186f48bc00","Type":"ContainerDied","Data":"ca172027b594ec2a19025713ecebb25383cdca50a7e4fa091bc0c960a84f0ff5"} Mar 20 11:40:04 crc kubenswrapper[4748]: I0320 11:40:04.980467 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca172027b594ec2a19025713ecebb25383cdca50a7e4fa091bc0c960a84f0ff5" Mar 20 11:40:04 crc kubenswrapper[4748]: I0320 11:40:04.980416 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-wm8dm" Mar 20 11:40:05 crc kubenswrapper[4748]: I0320 11:40:05.587164 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-gpvgr"] Mar 20 11:40:05 crc kubenswrapper[4748]: I0320 11:40:05.596645 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-gpvgr"] Mar 20 11:40:07 crc kubenswrapper[4748]: I0320 11:40:07.534149 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ce9df7-7771-4990-aae5-710d568b5c1e" path="/var/lib/kubelet/pods/d5ce9df7-7771-4990-aae5-710d568b5c1e/volumes" Mar 20 11:40:22 crc kubenswrapper[4748]: I0320 11:40:22.293342 4748 scope.go:117] "RemoveContainer" containerID="25733099c3b84a74f0ccadf3cfbbb9ca78894a58965e11e66a18a9ce47b60f65" Mar 20 11:40:42 crc kubenswrapper[4748]: I0320 11:40:42.928077 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:40:42 crc kubenswrapper[4748]: I0320 11:40:42.928608 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:41:12 crc kubenswrapper[4748]: I0320 11:41:12.928995 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:41:12 crc kubenswrapper[4748]: I0320 11:41:12.929528 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:41:42 crc kubenswrapper[4748]: I0320 11:41:42.928533 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:41:42 crc kubenswrapper[4748]: I0320 11:41:42.929066 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:41:42 crc kubenswrapper[4748]: I0320 11:41:42.929116 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 11:41:42 crc kubenswrapper[4748]: I0320 11:41:42.929859 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:41:42 crc kubenswrapper[4748]: I0320 11:41:42.929915 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" gracePeriod=600 Mar 20 11:41:43 crc kubenswrapper[4748]: E0320 11:41:43.055253 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:41:43 crc kubenswrapper[4748]: I0320 11:41:43.852447 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" exitCode=0 Mar 20 11:41:43 crc kubenswrapper[4748]: I0320 11:41:43.852491 4748 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933"} Mar 20 11:41:43 crc kubenswrapper[4748]: I0320 11:41:43.852523 4748 scope.go:117] "RemoveContainer" containerID="497a1fd1f41e7ba95a2cee4f115e9a20d225eab5bd9fa7ed076e957960cae82e" Mar 20 11:41:43 crc kubenswrapper[4748]: I0320 11:41:43.853160 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:41:43 crc kubenswrapper[4748]: E0320 11:41:43.853511 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:41:56 crc kubenswrapper[4748]: I0320 11:41:56.516043 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:41:56 crc kubenswrapper[4748]: E0320 11:41:56.516996 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.143985 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566782-vvbtb"] Mar 20 11:42:00 crc kubenswrapper[4748]: E0320 11:42:00.144965 
4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f104ce4-9d59-4c4b-97d8-8b186f48bc00" containerName="oc" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.144985 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f104ce4-9d59-4c4b-97d8-8b186f48bc00" containerName="oc" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.145260 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f104ce4-9d59-4c4b-97d8-8b186f48bc00" containerName="oc" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.145901 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-vvbtb" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.149360 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.149461 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.155283 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-vvbtb"] Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.155546 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.247323 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7jjv\" (UniqueName: \"kubernetes.io/projected/0c85e48f-fb2f-489f-8143-226f44751edd-kube-api-access-r7jjv\") pod \"auto-csr-approver-29566782-vvbtb\" (UID: \"0c85e48f-fb2f-489f-8143-226f44751edd\") " pod="openshift-infra/auto-csr-approver-29566782-vvbtb" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.349541 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r7jjv\" (UniqueName: \"kubernetes.io/projected/0c85e48f-fb2f-489f-8143-226f44751edd-kube-api-access-r7jjv\") pod \"auto-csr-approver-29566782-vvbtb\" (UID: \"0c85e48f-fb2f-489f-8143-226f44751edd\") " pod="openshift-infra/auto-csr-approver-29566782-vvbtb" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.372873 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7jjv\" (UniqueName: \"kubernetes.io/projected/0c85e48f-fb2f-489f-8143-226f44751edd-kube-api-access-r7jjv\") pod \"auto-csr-approver-29566782-vvbtb\" (UID: \"0c85e48f-fb2f-489f-8143-226f44751edd\") " pod="openshift-infra/auto-csr-approver-29566782-vvbtb" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.468631 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-vvbtb" Mar 20 11:42:00 crc kubenswrapper[4748]: I0320 11:42:00.919107 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-vvbtb"] Mar 20 11:42:01 crc kubenswrapper[4748]: I0320 11:42:01.002826 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-vvbtb" event={"ID":"0c85e48f-fb2f-489f-8143-226f44751edd","Type":"ContainerStarted","Data":"160f9511d00320739dc22f9bc21dcce4a08ac1cdd50c32fb852ac4f6eceeda7e"} Mar 20 11:42:03 crc kubenswrapper[4748]: I0320 11:42:03.020118 4748 generic.go:334] "Generic (PLEG): container finished" podID="0c85e48f-fb2f-489f-8143-226f44751edd" containerID="fd6901472ea5b987df1064749005e0ef3b2d705e51d677ecf4c54828a8d3799f" exitCode=0 Mar 20 11:42:03 crc kubenswrapper[4748]: I0320 11:42:03.020182 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-vvbtb" event={"ID":"0c85e48f-fb2f-489f-8143-226f44751edd","Type":"ContainerDied","Data":"fd6901472ea5b987df1064749005e0ef3b2d705e51d677ecf4c54828a8d3799f"} Mar 20 11:42:04 crc kubenswrapper[4748]: I0320 
11:42:04.449595 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-vvbtb" Mar 20 11:42:04 crc kubenswrapper[4748]: I0320 11:42:04.542660 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7jjv\" (UniqueName: \"kubernetes.io/projected/0c85e48f-fb2f-489f-8143-226f44751edd-kube-api-access-r7jjv\") pod \"0c85e48f-fb2f-489f-8143-226f44751edd\" (UID: \"0c85e48f-fb2f-489f-8143-226f44751edd\") " Mar 20 11:42:04 crc kubenswrapper[4748]: I0320 11:42:04.548706 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c85e48f-fb2f-489f-8143-226f44751edd-kube-api-access-r7jjv" (OuterVolumeSpecName: "kube-api-access-r7jjv") pod "0c85e48f-fb2f-489f-8143-226f44751edd" (UID: "0c85e48f-fb2f-489f-8143-226f44751edd"). InnerVolumeSpecName "kube-api-access-r7jjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:42:04 crc kubenswrapper[4748]: I0320 11:42:04.644795 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7jjv\" (UniqueName: \"kubernetes.io/projected/0c85e48f-fb2f-489f-8143-226f44751edd-kube-api-access-r7jjv\") on node \"crc\" DevicePath \"\"" Mar 20 11:42:05 crc kubenswrapper[4748]: I0320 11:42:05.040895 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-vvbtb" event={"ID":"0c85e48f-fb2f-489f-8143-226f44751edd","Type":"ContainerDied","Data":"160f9511d00320739dc22f9bc21dcce4a08ac1cdd50c32fb852ac4f6eceeda7e"} Mar 20 11:42:05 crc kubenswrapper[4748]: I0320 11:42:05.040941 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="160f9511d00320739dc22f9bc21dcce4a08ac1cdd50c32fb852ac4f6eceeda7e" Mar 20 11:42:05 crc kubenswrapper[4748]: I0320 11:42:05.040968 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-vvbtb" Mar 20 11:42:05 crc kubenswrapper[4748]: I0320 11:42:05.524584 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-rhmwq"] Mar 20 11:42:05 crc kubenswrapper[4748]: I0320 11:42:05.526497 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-rhmwq"] Mar 20 11:42:07 crc kubenswrapper[4748]: I0320 11:42:07.526374 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12887856-ffb1-4e0a-924c-f53d52d94235" path="/var/lib/kubelet/pods/12887856-ffb1-4e0a-924c-f53d52d94235/volumes" Mar 20 11:42:08 crc kubenswrapper[4748]: I0320 11:42:08.515383 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:42:08 crc kubenswrapper[4748]: E0320 11:42:08.515783 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.523234 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tkx42"] Mar 20 11:42:12 crc kubenswrapper[4748]: E0320 11:42:12.525557 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c85e48f-fb2f-489f-8143-226f44751edd" containerName="oc" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.525582 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c85e48f-fb2f-489f-8143-226f44751edd" containerName="oc" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.526056 4748 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0c85e48f-fb2f-489f-8143-226f44751edd" containerName="oc" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.527815 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.534921 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tkx42"] Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.589795 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls8pc\" (UniqueName: \"kubernetes.io/projected/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-kube-api-access-ls8pc\") pod \"certified-operators-tkx42\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.589871 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-utilities\") pod \"certified-operators-tkx42\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.589938 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-catalog-content\") pod \"certified-operators-tkx42\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.692029 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls8pc\" (UniqueName: \"kubernetes.io/projected/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-kube-api-access-ls8pc\") pod \"certified-operators-tkx42\" 
(UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.692097 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-utilities\") pod \"certified-operators-tkx42\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.692151 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-catalog-content\") pod \"certified-operators-tkx42\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.692736 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-catalog-content\") pod \"certified-operators-tkx42\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.693404 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-utilities\") pod \"certified-operators-tkx42\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.716579 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls8pc\" (UniqueName: \"kubernetes.io/projected/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-kube-api-access-ls8pc\") pod \"certified-operators-tkx42\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " 
pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:12 crc kubenswrapper[4748]: I0320 11:42:12.853856 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:13 crc kubenswrapper[4748]: I0320 11:42:13.423073 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tkx42"] Mar 20 11:42:14 crc kubenswrapper[4748]: I0320 11:42:14.118913 4748 generic.go:334] "Generic (PLEG): container finished" podID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerID="00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0" exitCode=0 Mar 20 11:42:14 crc kubenswrapper[4748]: I0320 11:42:14.119044 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tkx42" event={"ID":"7e668b48-23a8-4f2a-b8ed-4789b979fbaa","Type":"ContainerDied","Data":"00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0"} Mar 20 11:42:14 crc kubenswrapper[4748]: I0320 11:42:14.119297 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tkx42" event={"ID":"7e668b48-23a8-4f2a-b8ed-4789b979fbaa","Type":"ContainerStarted","Data":"2f10254d63405cc13017821bb2c4f56d701d803eab7f3500ed23ed55dced48d6"} Mar 20 11:42:16 crc kubenswrapper[4748]: I0320 11:42:16.136975 4748 generic.go:334] "Generic (PLEG): container finished" podID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerID="67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941" exitCode=0 Mar 20 11:42:16 crc kubenswrapper[4748]: I0320 11:42:16.137133 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tkx42" event={"ID":"7e668b48-23a8-4f2a-b8ed-4789b979fbaa","Type":"ContainerDied","Data":"67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941"} Mar 20 11:42:17 crc kubenswrapper[4748]: I0320 11:42:17.147210 4748 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-tkx42" event={"ID":"7e668b48-23a8-4f2a-b8ed-4789b979fbaa","Type":"ContainerStarted","Data":"02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071"} Mar 20 11:42:17 crc kubenswrapper[4748]: I0320 11:42:17.171777 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tkx42" podStartSLOduration=2.424049632 podStartE2EDuration="5.171755396s" podCreationTimestamp="2026-03-20 11:42:12 +0000 UTC" firstStartedPulling="2026-03-20 11:42:14.123566019 +0000 UTC m=+3969.265111833" lastFinishedPulling="2026-03-20 11:42:16.871271783 +0000 UTC m=+3972.012817597" observedRunningTime="2026-03-20 11:42:17.168075922 +0000 UTC m=+3972.309621746" watchObservedRunningTime="2026-03-20 11:42:17.171755396 +0000 UTC m=+3972.313301210" Mar 20 11:42:21 crc kubenswrapper[4748]: I0320 11:42:21.515967 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:42:21 crc kubenswrapper[4748]: E0320 11:42:21.517811 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:42:22 crc kubenswrapper[4748]: I0320 11:42:22.391077 4748 scope.go:117] "RemoveContainer" containerID="6236884d31a49d4756f2ff3cf7fee61b1fb1eaa800f6f9a1b1772761de81c87e" Mar 20 11:42:22 crc kubenswrapper[4748]: I0320 11:42:22.854035 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:22 crc kubenswrapper[4748]: I0320 11:42:22.855443 4748 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:22 crc kubenswrapper[4748]: I0320 11:42:22.908479 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:23 crc kubenswrapper[4748]: I0320 11:42:23.249020 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:23 crc kubenswrapper[4748]: I0320 11:42:23.300229 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tkx42"] Mar 20 11:42:25 crc kubenswrapper[4748]: I0320 11:42:25.216246 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tkx42" podUID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerName="registry-server" containerID="cri-o://02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071" gracePeriod=2 Mar 20 11:42:25 crc kubenswrapper[4748]: I0320 11:42:25.788806 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:25 crc kubenswrapper[4748]: I0320 11:42:25.952307 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls8pc\" (UniqueName: \"kubernetes.io/projected/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-kube-api-access-ls8pc\") pod \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " Mar 20 11:42:25 crc kubenswrapper[4748]: I0320 11:42:25.952644 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-catalog-content\") pod \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " Mar 20 11:42:25 crc kubenswrapper[4748]: I0320 11:42:25.952772 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-utilities\") pod \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\" (UID: \"7e668b48-23a8-4f2a-b8ed-4789b979fbaa\") " Mar 20 11:42:25 crc kubenswrapper[4748]: I0320 11:42:25.954185 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-utilities" (OuterVolumeSpecName: "utilities") pod "7e668b48-23a8-4f2a-b8ed-4789b979fbaa" (UID: "7e668b48-23a8-4f2a-b8ed-4789b979fbaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:42:25 crc kubenswrapper[4748]: I0320 11:42:25.958863 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-kube-api-access-ls8pc" (OuterVolumeSpecName: "kube-api-access-ls8pc") pod "7e668b48-23a8-4f2a-b8ed-4789b979fbaa" (UID: "7e668b48-23a8-4f2a-b8ed-4789b979fbaa"). InnerVolumeSpecName "kube-api-access-ls8pc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.055932 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.056670 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls8pc\" (UniqueName: \"kubernetes.io/projected/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-kube-api-access-ls8pc\") on node \"crc\" DevicePath \"\"" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.225945 4748 generic.go:334] "Generic (PLEG): container finished" podID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerID="02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071" exitCode=0 Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.225990 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tkx42" event={"ID":"7e668b48-23a8-4f2a-b8ed-4789b979fbaa","Type":"ContainerDied","Data":"02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071"} Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.226023 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tkx42" event={"ID":"7e668b48-23a8-4f2a-b8ed-4789b979fbaa","Type":"ContainerDied","Data":"2f10254d63405cc13017821bb2c4f56d701d803eab7f3500ed23ed55dced48d6"} Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.226043 4748 scope.go:117] "RemoveContainer" containerID="02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.226188 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tkx42" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.247356 4748 scope.go:117] "RemoveContainer" containerID="67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.272402 4748 scope.go:117] "RemoveContainer" containerID="00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.307615 4748 scope.go:117] "RemoveContainer" containerID="02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071" Mar 20 11:42:26 crc kubenswrapper[4748]: E0320 11:42:26.308172 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071\": container with ID starting with 02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071 not found: ID does not exist" containerID="02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.308234 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071"} err="failed to get container status \"02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071\": rpc error: code = NotFound desc = could not find container \"02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071\": container with ID starting with 02e9b18d6acb2bd0bdc5b41229a45a45ba8017621e7f1ce90214f920fcf89071 not found: ID does not exist" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.308271 4748 scope.go:117] "RemoveContainer" containerID="67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941" Mar 20 11:42:26 crc kubenswrapper[4748]: E0320 11:42:26.308765 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941\": container with ID starting with 67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941 not found: ID does not exist" containerID="67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.308800 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941"} err="failed to get container status \"67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941\": rpc error: code = NotFound desc = could not find container \"67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941\": container with ID starting with 67cfed4d42f9b3ad1dc104cd5a59f78acbd23f3511c229d7049d63d1f6570941 not found: ID does not exist" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.308820 4748 scope.go:117] "RemoveContainer" containerID="00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0" Mar 20 11:42:26 crc kubenswrapper[4748]: E0320 11:42:26.309297 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0\": container with ID starting with 00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0 not found: ID does not exist" containerID="00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.309335 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0"} err="failed to get container status \"00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0\": rpc error: code = NotFound desc = could not find container 
\"00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0\": container with ID starting with 00b80bfb517fd8d1518da2f97314a546235122189e1af9f8fd70b63eacefafb0 not found: ID does not exist" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.736112 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e668b48-23a8-4f2a-b8ed-4789b979fbaa" (UID: "7e668b48-23a8-4f2a-b8ed-4789b979fbaa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.768381 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e668b48-23a8-4f2a-b8ed-4789b979fbaa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.863009 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tkx42"] Mar 20 11:42:26 crc kubenswrapper[4748]: I0320 11:42:26.879325 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tkx42"] Mar 20 11:42:27 crc kubenswrapper[4748]: I0320 11:42:27.531112 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" path="/var/lib/kubelet/pods/7e668b48-23a8-4f2a-b8ed-4789b979fbaa/volumes" Mar 20 11:42:35 crc kubenswrapper[4748]: I0320 11:42:35.528246 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:42:35 crc kubenswrapper[4748]: E0320 11:42:35.529491 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:42:49 crc kubenswrapper[4748]: I0320 11:42:49.515529 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:42:49 crc kubenswrapper[4748]: E0320 11:42:49.516168 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:43:01 crc kubenswrapper[4748]: I0320 11:43:01.515768 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:43:01 crc kubenswrapper[4748]: E0320 11:43:01.516524 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:43:15 crc kubenswrapper[4748]: I0320 11:43:15.521082 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:43:15 crc kubenswrapper[4748]: E0320 11:43:15.521998 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:43:30 crc kubenswrapper[4748]: I0320 11:43:30.515653 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:43:30 crc kubenswrapper[4748]: E0320 11:43:30.517564 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:43:42 crc kubenswrapper[4748]: I0320 11:43:42.515466 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:43:42 crc kubenswrapper[4748]: E0320 11:43:42.516341 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:43:54 crc kubenswrapper[4748]: I0320 11:43:54.516255 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:43:54 crc kubenswrapper[4748]: E0320 11:43:54.517231 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.154541 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566784-9bjsw"] Mar 20 11:44:00 crc kubenswrapper[4748]: E0320 11:44:00.155527 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerName="extract-content" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.155545 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerName="extract-content" Mar 20 11:44:00 crc kubenswrapper[4748]: E0320 11:44:00.155585 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerName="extract-utilities" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.155594 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerName="extract-utilities" Mar 20 11:44:00 crc kubenswrapper[4748]: E0320 11:44:00.155614 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.155623 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.155886 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e668b48-23a8-4f2a-b8ed-4789b979fbaa" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.156656 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-9bjsw" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.159432 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.159625 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.159665 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.174872 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-9bjsw"] Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.195941 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zht5n\" (UniqueName: \"kubernetes.io/projected/e98a37db-3a1d-4b1b-9442-7f4d005d3c1f-kube-api-access-zht5n\") pod \"auto-csr-approver-29566784-9bjsw\" (UID: \"e98a37db-3a1d-4b1b-9442-7f4d005d3c1f\") " pod="openshift-infra/auto-csr-approver-29566784-9bjsw" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.298211 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zht5n\" (UniqueName: \"kubernetes.io/projected/e98a37db-3a1d-4b1b-9442-7f4d005d3c1f-kube-api-access-zht5n\") pod \"auto-csr-approver-29566784-9bjsw\" (UID: \"e98a37db-3a1d-4b1b-9442-7f4d005d3c1f\") " pod="openshift-infra/auto-csr-approver-29566784-9bjsw" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.322097 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zht5n\" (UniqueName: \"kubernetes.io/projected/e98a37db-3a1d-4b1b-9442-7f4d005d3c1f-kube-api-access-zht5n\") pod \"auto-csr-approver-29566784-9bjsw\" (UID: \"e98a37db-3a1d-4b1b-9442-7f4d005d3c1f\") " 
pod="openshift-infra/auto-csr-approver-29566784-9bjsw" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.483387 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-9bjsw" Mar 20 11:44:00 crc kubenswrapper[4748]: I0320 11:44:00.949807 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-9bjsw"] Mar 20 11:44:02 crc kubenswrapper[4748]: I0320 11:44:02.073427 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-9bjsw" event={"ID":"e98a37db-3a1d-4b1b-9442-7f4d005d3c1f","Type":"ContainerStarted","Data":"fe2a1b827140d721b61dfd083b024e2cbf8eff85d5fc724a55b4772bb22f253a"} Mar 20 11:44:03 crc kubenswrapper[4748]: I0320 11:44:03.084780 4748 generic.go:334] "Generic (PLEG): container finished" podID="e98a37db-3a1d-4b1b-9442-7f4d005d3c1f" containerID="6ff3a1bc20f0af391a6f5a181e62de21f5b826195df1c2c99f784c889a972fac" exitCode=0 Mar 20 11:44:03 crc kubenswrapper[4748]: I0320 11:44:03.084885 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-9bjsw" event={"ID":"e98a37db-3a1d-4b1b-9442-7f4d005d3c1f","Type":"ContainerDied","Data":"6ff3a1bc20f0af391a6f5a181e62de21f5b826195df1c2c99f784c889a972fac"} Mar 20 11:44:04 crc kubenswrapper[4748]: I0320 11:44:04.508302 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-9bjsw" Mar 20 11:44:04 crc kubenswrapper[4748]: I0320 11:44:04.585974 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zht5n\" (UniqueName: \"kubernetes.io/projected/e98a37db-3a1d-4b1b-9442-7f4d005d3c1f-kube-api-access-zht5n\") pod \"e98a37db-3a1d-4b1b-9442-7f4d005d3c1f\" (UID: \"e98a37db-3a1d-4b1b-9442-7f4d005d3c1f\") " Mar 20 11:44:04 crc kubenswrapper[4748]: I0320 11:44:04.592410 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e98a37db-3a1d-4b1b-9442-7f4d005d3c1f-kube-api-access-zht5n" (OuterVolumeSpecName: "kube-api-access-zht5n") pod "e98a37db-3a1d-4b1b-9442-7f4d005d3c1f" (UID: "e98a37db-3a1d-4b1b-9442-7f4d005d3c1f"). InnerVolumeSpecName "kube-api-access-zht5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:44:04 crc kubenswrapper[4748]: I0320 11:44:04.688815 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zht5n\" (UniqueName: \"kubernetes.io/projected/e98a37db-3a1d-4b1b-9442-7f4d005d3c1f-kube-api-access-zht5n\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:05 crc kubenswrapper[4748]: I0320 11:44:05.102328 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-9bjsw" event={"ID":"e98a37db-3a1d-4b1b-9442-7f4d005d3c1f","Type":"ContainerDied","Data":"fe2a1b827140d721b61dfd083b024e2cbf8eff85d5fc724a55b4772bb22f253a"} Mar 20 11:44:05 crc kubenswrapper[4748]: I0320 11:44:05.102366 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe2a1b827140d721b61dfd083b024e2cbf8eff85d5fc724a55b4772bb22f253a" Mar 20 11:44:05 crc kubenswrapper[4748]: I0320 11:44:05.102395 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-9bjsw" Mar 20 11:44:05 crc kubenswrapper[4748]: I0320 11:44:05.579407 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dscqn"] Mar 20 11:44:05 crc kubenswrapper[4748]: I0320 11:44:05.587814 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dscqn"] Mar 20 11:44:07 crc kubenswrapper[4748]: I0320 11:44:07.529183 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3961923-315d-4a52-98d7-98a8a9c3a9fc" path="/var/lib/kubelet/pods/d3961923-315d-4a52-98d7-98a8a9c3a9fc/volumes" Mar 20 11:44:08 crc kubenswrapper[4748]: I0320 11:44:08.515251 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:44:08 crc kubenswrapper[4748]: E0320 11:44:08.515878 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.628782 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fl95f"] Mar 20 11:44:19 crc kubenswrapper[4748]: E0320 11:44:19.629700 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e98a37db-3a1d-4b1b-9442-7f4d005d3c1f" containerName="oc" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.629717 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98a37db-3a1d-4b1b-9442-7f4d005d3c1f" containerName="oc" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.630292 4748 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e98a37db-3a1d-4b1b-9442-7f4d005d3c1f" containerName="oc" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.631801 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.654154 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fl95f"] Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.782472 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-utilities\") pod \"redhat-marketplace-fl95f\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.782942 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jvn4\" (UniqueName: \"kubernetes.io/projected/322e4a1e-f319-4644-930c-5195ff246a22-kube-api-access-6jvn4\") pod \"redhat-marketplace-fl95f\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.783103 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-catalog-content\") pod \"redhat-marketplace-fl95f\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.885247 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-catalog-content\") pod \"redhat-marketplace-fl95f\" (UID: 
\"322e4a1e-f319-4644-930c-5195ff246a22\") " pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.885382 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-utilities\") pod \"redhat-marketplace-fl95f\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.885425 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jvn4\" (UniqueName: \"kubernetes.io/projected/322e4a1e-f319-4644-930c-5195ff246a22-kube-api-access-6jvn4\") pod \"redhat-marketplace-fl95f\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.886195 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-catalog-content\") pod \"redhat-marketplace-fl95f\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.886415 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-utilities\") pod \"redhat-marketplace-fl95f\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.904959 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jvn4\" (UniqueName: \"kubernetes.io/projected/322e4a1e-f319-4644-930c-5195ff246a22-kube-api-access-6jvn4\") pod \"redhat-marketplace-fl95f\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " 
pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:19 crc kubenswrapper[4748]: I0320 11:44:19.955477 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:20 crc kubenswrapper[4748]: I0320 11:44:20.440387 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fl95f"] Mar 20 11:44:21 crc kubenswrapper[4748]: I0320 11:44:21.240778 4748 generic.go:334] "Generic (PLEG): container finished" podID="322e4a1e-f319-4644-930c-5195ff246a22" containerID="ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82" exitCode=0 Mar 20 11:44:21 crc kubenswrapper[4748]: I0320 11:44:21.240885 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl95f" event={"ID":"322e4a1e-f319-4644-930c-5195ff246a22","Type":"ContainerDied","Data":"ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82"} Mar 20 11:44:21 crc kubenswrapper[4748]: I0320 11:44:21.241069 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl95f" event={"ID":"322e4a1e-f319-4644-930c-5195ff246a22","Type":"ContainerStarted","Data":"168f019332a706b2a339c5efe4c701b7c726c60186abd2224d7e76f5dfe93b78"} Mar 20 11:44:22 crc kubenswrapper[4748]: I0320 11:44:22.252422 4748 generic.go:334] "Generic (PLEG): container finished" podID="322e4a1e-f319-4644-930c-5195ff246a22" containerID="59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0" exitCode=0 Mar 20 11:44:22 crc kubenswrapper[4748]: I0320 11:44:22.252472 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl95f" event={"ID":"322e4a1e-f319-4644-930c-5195ff246a22","Type":"ContainerDied","Data":"59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0"} Mar 20 11:44:22 crc kubenswrapper[4748]: I0320 11:44:22.494525 4748 scope.go:117] "RemoveContainer" 
containerID="db990d66cf4ee1f553d87f88d3f6ca6b0c6dc157bd0b5eb9a2ac783626d34078" Mar 20 11:44:23 crc kubenswrapper[4748]: I0320 11:44:23.262655 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl95f" event={"ID":"322e4a1e-f319-4644-930c-5195ff246a22","Type":"ContainerStarted","Data":"2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728"} Mar 20 11:44:23 crc kubenswrapper[4748]: I0320 11:44:23.284260 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fl95f" podStartSLOduration=2.811068259 podStartE2EDuration="4.284218893s" podCreationTimestamp="2026-03-20 11:44:19 +0000 UTC" firstStartedPulling="2026-03-20 11:44:21.243775584 +0000 UTC m=+4096.385321408" lastFinishedPulling="2026-03-20 11:44:22.716926228 +0000 UTC m=+4097.858472042" observedRunningTime="2026-03-20 11:44:23.280777956 +0000 UTC m=+4098.422323780" watchObservedRunningTime="2026-03-20 11:44:23.284218893 +0000 UTC m=+4098.425764707" Mar 20 11:44:23 crc kubenswrapper[4748]: I0320 11:44:23.515997 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:44:23 crc kubenswrapper[4748]: E0320 11:44:23.516282 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:44:29 crc kubenswrapper[4748]: I0320 11:44:29.956942 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:29 crc kubenswrapper[4748]: I0320 11:44:29.957993 4748 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:30 crc kubenswrapper[4748]: I0320 11:44:30.003470 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:30 crc kubenswrapper[4748]: I0320 11:44:30.373497 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:30 crc kubenswrapper[4748]: I0320 11:44:30.420888 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fl95f"] Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.346895 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fl95f" podUID="322e4a1e-f319-4644-930c-5195ff246a22" containerName="registry-server" containerID="cri-o://2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728" gracePeriod=2 Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.764538 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.835201 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jvn4\" (UniqueName: \"kubernetes.io/projected/322e4a1e-f319-4644-930c-5195ff246a22-kube-api-access-6jvn4\") pod \"322e4a1e-f319-4644-930c-5195ff246a22\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.835455 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-catalog-content\") pod \"322e4a1e-f319-4644-930c-5195ff246a22\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.835488 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-utilities\") pod \"322e4a1e-f319-4644-930c-5195ff246a22\" (UID: \"322e4a1e-f319-4644-930c-5195ff246a22\") " Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.836647 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-utilities" (OuterVolumeSpecName: "utilities") pod "322e4a1e-f319-4644-930c-5195ff246a22" (UID: "322e4a1e-f319-4644-930c-5195ff246a22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.843203 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/322e4a1e-f319-4644-930c-5195ff246a22-kube-api-access-6jvn4" (OuterVolumeSpecName: "kube-api-access-6jvn4") pod "322e4a1e-f319-4644-930c-5195ff246a22" (UID: "322e4a1e-f319-4644-930c-5195ff246a22"). InnerVolumeSpecName "kube-api-access-6jvn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.926795 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "322e4a1e-f319-4644-930c-5195ff246a22" (UID: "322e4a1e-f319-4644-930c-5195ff246a22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.938039 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jvn4\" (UniqueName: \"kubernetes.io/projected/322e4a1e-f319-4644-930c-5195ff246a22-kube-api-access-6jvn4\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.938304 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:32 crc kubenswrapper[4748]: I0320 11:44:32.938416 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/322e4a1e-f319-4644-930c-5195ff246a22-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.356903 4748 generic.go:334] "Generic (PLEG): container finished" podID="322e4a1e-f319-4644-930c-5195ff246a22" containerID="2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728" exitCode=0 Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.356971 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fl95f" event={"ID":"322e4a1e-f319-4644-930c-5195ff246a22","Type":"ContainerDied","Data":"2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728"} Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.357011 4748 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fl95f" event={"ID":"322e4a1e-f319-4644-930c-5195ff246a22","Type":"ContainerDied","Data":"168f019332a706b2a339c5efe4c701b7c726c60186abd2224d7e76f5dfe93b78"} Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.357039 4748 scope.go:117] "RemoveContainer" containerID="2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.357254 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fl95f" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.385138 4748 scope.go:117] "RemoveContainer" containerID="59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.399002 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fl95f"] Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.407690 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fl95f"] Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.428351 4748 scope.go:117] "RemoveContainer" containerID="ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.478292 4748 scope.go:117] "RemoveContainer" containerID="2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728" Mar 20 11:44:33 crc kubenswrapper[4748]: E0320 11:44:33.478689 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728\": container with ID starting with 2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728 not found: ID does not exist" containerID="2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.478744 4748 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728"} err="failed to get container status \"2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728\": rpc error: code = NotFound desc = could not find container \"2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728\": container with ID starting with 2324521c117803e618a67790ab4de42d2dfe3a629c2887d8ad8fa4e61e4e0728 not found: ID does not exist" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.478775 4748 scope.go:117] "RemoveContainer" containerID="59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0" Mar 20 11:44:33 crc kubenswrapper[4748]: E0320 11:44:33.479173 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0\": container with ID starting with 59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0 not found: ID does not exist" containerID="59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.479208 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0"} err="failed to get container status \"59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0\": rpc error: code = NotFound desc = could not find container \"59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0\": container with ID starting with 59344a71475ec7b7bfc40dc2c08a49edbe8ba402a8f3116a5ff3c3bc134193a0 not found: ID does not exist" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.479237 4748 scope.go:117] "RemoveContainer" containerID="ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82" Mar 20 11:44:33 crc kubenswrapper[4748]: E0320 
11:44:33.479509 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82\": container with ID starting with ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82 not found: ID does not exist" containerID="ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.479533 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82"} err="failed to get container status \"ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82\": rpc error: code = NotFound desc = could not find container \"ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82\": container with ID starting with ae3b2f10e93ea2e8e0a3b925c18719685a6f2d3fe7c7c89e246d1670f328ae82 not found: ID does not exist" Mar 20 11:44:33 crc kubenswrapper[4748]: I0320 11:44:33.525936 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="322e4a1e-f319-4644-930c-5195ff246a22" path="/var/lib/kubelet/pods/322e4a1e-f319-4644-930c-5195ff246a22/volumes" Mar 20 11:44:38 crc kubenswrapper[4748]: I0320 11:44:38.515434 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:44:38 crc kubenswrapper[4748]: E0320 11:44:38.516280 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:44:52 crc kubenswrapper[4748]: I0320 11:44:52.515367 
4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:44:52 crc kubenswrapper[4748]: E0320 11:44:52.517378 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.057007 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8kn4d"] Mar 20 11:44:55 crc kubenswrapper[4748]: E0320 11:44:55.057758 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322e4a1e-f319-4644-930c-5195ff246a22" containerName="extract-utilities" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.057774 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="322e4a1e-f319-4644-930c-5195ff246a22" containerName="extract-utilities" Mar 20 11:44:55 crc kubenswrapper[4748]: E0320 11:44:55.057806 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322e4a1e-f319-4644-930c-5195ff246a22" containerName="registry-server" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.057812 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="322e4a1e-f319-4644-930c-5195ff246a22" containerName="registry-server" Mar 20 11:44:55 crc kubenswrapper[4748]: E0320 11:44:55.057849 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322e4a1e-f319-4644-930c-5195ff246a22" containerName="extract-content" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.057855 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="322e4a1e-f319-4644-930c-5195ff246a22" containerName="extract-content" Mar 20 11:44:55 crc 
kubenswrapper[4748]: I0320 11:44:55.058053 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="322e4a1e-f319-4644-930c-5195ff246a22" containerName="registry-server" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.059586 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.066218 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8kn4d"] Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.123170 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-catalog-content\") pod \"redhat-operators-8kn4d\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.123353 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-utilities\") pod \"redhat-operators-8kn4d\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.124103 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwpw6\" (UniqueName: \"kubernetes.io/projected/3021b62b-12d9-4bea-ad56-39a75ba44da7-kube-api-access-qwpw6\") pod \"redhat-operators-8kn4d\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.225935 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwpw6\" (UniqueName: 
\"kubernetes.io/projected/3021b62b-12d9-4bea-ad56-39a75ba44da7-kube-api-access-qwpw6\") pod \"redhat-operators-8kn4d\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.226026 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-catalog-content\") pod \"redhat-operators-8kn4d\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.226073 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-utilities\") pod \"redhat-operators-8kn4d\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.226708 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-utilities\") pod \"redhat-operators-8kn4d\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.226713 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-catalog-content\") pod \"redhat-operators-8kn4d\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.248537 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwpw6\" (UniqueName: 
\"kubernetes.io/projected/3021b62b-12d9-4bea-ad56-39a75ba44da7-kube-api-access-qwpw6\") pod \"redhat-operators-8kn4d\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.393868 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:44:55 crc kubenswrapper[4748]: I0320 11:44:55.878239 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8kn4d"] Mar 20 11:44:56 crc kubenswrapper[4748]: I0320 11:44:56.796017 4748 generic.go:334] "Generic (PLEG): container finished" podID="3021b62b-12d9-4bea-ad56-39a75ba44da7" containerID="8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c" exitCode=0 Mar 20 11:44:56 crc kubenswrapper[4748]: I0320 11:44:56.796099 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kn4d" event={"ID":"3021b62b-12d9-4bea-ad56-39a75ba44da7","Type":"ContainerDied","Data":"8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c"} Mar 20 11:44:56 crc kubenswrapper[4748]: I0320 11:44:56.796333 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kn4d" event={"ID":"3021b62b-12d9-4bea-ad56-39a75ba44da7","Type":"ContainerStarted","Data":"3279747f7d8f303e27816358a258743930935716a422dce18ad3351e7786d554"} Mar 20 11:44:57 crc kubenswrapper[4748]: I0320 11:44:57.807475 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kn4d" event={"ID":"3021b62b-12d9-4bea-ad56-39a75ba44da7","Type":"ContainerStarted","Data":"47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92"} Mar 20 11:44:58 crc kubenswrapper[4748]: I0320 11:44:58.819469 4748 generic.go:334] "Generic (PLEG): container finished" podID="3021b62b-12d9-4bea-ad56-39a75ba44da7" 
containerID="47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92" exitCode=0 Mar 20 11:44:58 crc kubenswrapper[4748]: I0320 11:44:58.819563 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kn4d" event={"ID":"3021b62b-12d9-4bea-ad56-39a75ba44da7","Type":"ContainerDied","Data":"47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92"} Mar 20 11:44:59 crc kubenswrapper[4748]: I0320 11:44:59.832469 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kn4d" event={"ID":"3021b62b-12d9-4bea-ad56-39a75ba44da7","Type":"ContainerStarted","Data":"70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a"} Mar 20 11:44:59 crc kubenswrapper[4748]: I0320 11:44:59.850133 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8kn4d" podStartSLOduration=2.469256996 podStartE2EDuration="4.850116074s" podCreationTimestamp="2026-03-20 11:44:55 +0000 UTC" firstStartedPulling="2026-03-20 11:44:56.800555406 +0000 UTC m=+4131.942101220" lastFinishedPulling="2026-03-20 11:44:59.181414484 +0000 UTC m=+4134.322960298" observedRunningTime="2026-03-20 11:44:59.848045512 +0000 UTC m=+4134.989591346" watchObservedRunningTime="2026-03-20 11:44:59.850116074 +0000 UTC m=+4134.991661888" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.157161 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq"] Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.159360 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.163170 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.163294 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.171762 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq"] Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.221967 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be605b49-d384-4a69-b79d-f3ae02325da4-secret-volume\") pod \"collect-profiles-29566785-c7czq\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.222257 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be605b49-d384-4a69-b79d-f3ae02325da4-config-volume\") pod \"collect-profiles-29566785-c7czq\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.222313 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7rm\" (UniqueName: \"kubernetes.io/projected/be605b49-d384-4a69-b79d-f3ae02325da4-kube-api-access-jt7rm\") pod \"collect-profiles-29566785-c7czq\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.323530 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be605b49-d384-4a69-b79d-f3ae02325da4-secret-volume\") pod \"collect-profiles-29566785-c7czq\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.323691 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be605b49-d384-4a69-b79d-f3ae02325da4-config-volume\") pod \"collect-profiles-29566785-c7czq\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.323722 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt7rm\" (UniqueName: \"kubernetes.io/projected/be605b49-d384-4a69-b79d-f3ae02325da4-kube-api-access-jt7rm\") pod \"collect-profiles-29566785-c7czq\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.325665 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be605b49-d384-4a69-b79d-f3ae02325da4-config-volume\") pod \"collect-profiles-29566785-c7czq\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.572236 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/be605b49-d384-4a69-b79d-f3ae02325da4-secret-volume\") pod \"collect-profiles-29566785-c7czq\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.573134 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7rm\" (UniqueName: \"kubernetes.io/projected/be605b49-d384-4a69-b79d-f3ae02325da4-kube-api-access-jt7rm\") pod \"collect-profiles-29566785-c7czq\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:00 crc kubenswrapper[4748]: I0320 11:45:00.786713 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:01 crc kubenswrapper[4748]: I0320 11:45:01.269128 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq"] Mar 20 11:45:01 crc kubenswrapper[4748]: W0320 11:45:01.274354 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe605b49_d384_4a69_b79d_f3ae02325da4.slice/crio-88cdd931f05f11a87e2fbd7f1374d53ed53aa5d7743b19ed62d032b3859a00a6 WatchSource:0}: Error finding container 88cdd931f05f11a87e2fbd7f1374d53ed53aa5d7743b19ed62d032b3859a00a6: Status 404 returned error can't find the container with id 88cdd931f05f11a87e2fbd7f1374d53ed53aa5d7743b19ed62d032b3859a00a6 Mar 20 11:45:01 crc kubenswrapper[4748]: I0320 11:45:01.852170 4748 generic.go:334] "Generic (PLEG): container finished" podID="be605b49-d384-4a69-b79d-f3ae02325da4" containerID="0599a0e3f5f84d892ed36f17f295a62b97c46261feb03714fdcd10c0c93caf30" exitCode=0 Mar 20 11:45:01 crc kubenswrapper[4748]: I0320 11:45:01.852448 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" event={"ID":"be605b49-d384-4a69-b79d-f3ae02325da4","Type":"ContainerDied","Data":"0599a0e3f5f84d892ed36f17f295a62b97c46261feb03714fdcd10c0c93caf30"} Mar 20 11:45:01 crc kubenswrapper[4748]: I0320 11:45:01.852474 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" event={"ID":"be605b49-d384-4a69-b79d-f3ae02325da4","Type":"ContainerStarted","Data":"88cdd931f05f11a87e2fbd7f1374d53ed53aa5d7743b19ed62d032b3859a00a6"} Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.238376 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.398170 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be605b49-d384-4a69-b79d-f3ae02325da4-secret-volume\") pod \"be605b49-d384-4a69-b79d-f3ae02325da4\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.398360 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt7rm\" (UniqueName: \"kubernetes.io/projected/be605b49-d384-4a69-b79d-f3ae02325da4-kube-api-access-jt7rm\") pod \"be605b49-d384-4a69-b79d-f3ae02325da4\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.398478 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be605b49-d384-4a69-b79d-f3ae02325da4-config-volume\") pod \"be605b49-d384-4a69-b79d-f3ae02325da4\" (UID: \"be605b49-d384-4a69-b79d-f3ae02325da4\") " Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.399074 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/be605b49-d384-4a69-b79d-f3ae02325da4-config-volume" (OuterVolumeSpecName: "config-volume") pod "be605b49-d384-4a69-b79d-f3ae02325da4" (UID: "be605b49-d384-4a69-b79d-f3ae02325da4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.404885 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be605b49-d384-4a69-b79d-f3ae02325da4-kube-api-access-jt7rm" (OuterVolumeSpecName: "kube-api-access-jt7rm") pod "be605b49-d384-4a69-b79d-f3ae02325da4" (UID: "be605b49-d384-4a69-b79d-f3ae02325da4"). InnerVolumeSpecName "kube-api-access-jt7rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.405303 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be605b49-d384-4a69-b79d-f3ae02325da4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "be605b49-d384-4a69-b79d-f3ae02325da4" (UID: "be605b49-d384-4a69-b79d-f3ae02325da4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.501245 4748 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be605b49-d384-4a69-b79d-f3ae02325da4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.501648 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt7rm\" (UniqueName: \"kubernetes.io/projected/be605b49-d384-4a69-b79d-f3ae02325da4-kube-api-access-jt7rm\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.501670 4748 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be605b49-d384-4a69-b79d-f3ae02325da4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.879493 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" event={"ID":"be605b49-d384-4a69-b79d-f3ae02325da4","Type":"ContainerDied","Data":"88cdd931f05f11a87e2fbd7f1374d53ed53aa5d7743b19ed62d032b3859a00a6"} Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.879532 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88cdd931f05f11a87e2fbd7f1374d53ed53aa5d7743b19ed62d032b3859a00a6" Mar 20 11:45:03 crc kubenswrapper[4748]: I0320 11:45:03.879676 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-c7czq" Mar 20 11:45:04 crc kubenswrapper[4748]: I0320 11:45:04.324257 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf"] Mar 20 11:45:04 crc kubenswrapper[4748]: I0320 11:45:04.334696 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-nhhgf"] Mar 20 11:45:05 crc kubenswrapper[4748]: I0320 11:45:05.393957 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:45:05 crc kubenswrapper[4748]: I0320 11:45:05.394401 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:45:05 crc kubenswrapper[4748]: I0320 11:45:05.447393 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:45:05 crc kubenswrapper[4748]: I0320 11:45:05.527931 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d6cdaef-9045-42e8-8033-abba36827d27" path="/var/lib/kubelet/pods/7d6cdaef-9045-42e8-8033-abba36827d27/volumes" Mar 20 11:45:05 crc kubenswrapper[4748]: I0320 11:45:05.944632 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:45:05 crc kubenswrapper[4748]: I0320 11:45:05.991603 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8kn4d"] Mar 20 11:45:06 crc kubenswrapper[4748]: I0320 11:45:06.515445 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:45:06 crc kubenswrapper[4748]: E0320 11:45:06.515732 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:45:07 crc kubenswrapper[4748]: I0320 11:45:07.913715 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8kn4d" podUID="3021b62b-12d9-4bea-ad56-39a75ba44da7" containerName="registry-server" containerID="cri-o://70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a" gracePeriod=2 Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.362157 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.431801 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-utilities\") pod \"3021b62b-12d9-4bea-ad56-39a75ba44da7\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.431954 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-catalog-content\") pod \"3021b62b-12d9-4bea-ad56-39a75ba44da7\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.432023 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwpw6\" (UniqueName: \"kubernetes.io/projected/3021b62b-12d9-4bea-ad56-39a75ba44da7-kube-api-access-qwpw6\") pod \"3021b62b-12d9-4bea-ad56-39a75ba44da7\" (UID: \"3021b62b-12d9-4bea-ad56-39a75ba44da7\") " Mar 20 11:45:08 crc 
kubenswrapper[4748]: I0320 11:45:08.432829 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-utilities" (OuterVolumeSpecName: "utilities") pod "3021b62b-12d9-4bea-ad56-39a75ba44da7" (UID: "3021b62b-12d9-4bea-ad56-39a75ba44da7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.455449 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3021b62b-12d9-4bea-ad56-39a75ba44da7-kube-api-access-qwpw6" (OuterVolumeSpecName: "kube-api-access-qwpw6") pod "3021b62b-12d9-4bea-ad56-39a75ba44da7" (UID: "3021b62b-12d9-4bea-ad56-39a75ba44da7"). InnerVolumeSpecName "kube-api-access-qwpw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.534431 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.534464 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwpw6\" (UniqueName: \"kubernetes.io/projected/3021b62b-12d9-4bea-ad56-39a75ba44da7-kube-api-access-qwpw6\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.925848 4748 generic.go:334] "Generic (PLEG): container finished" podID="3021b62b-12d9-4bea-ad56-39a75ba44da7" containerID="70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a" exitCode=0 Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.925890 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kn4d" event={"ID":"3021b62b-12d9-4bea-ad56-39a75ba44da7","Type":"ContainerDied","Data":"70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a"} Mar 20 11:45:08 
crc kubenswrapper[4748]: I0320 11:45:08.925914 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kn4d" event={"ID":"3021b62b-12d9-4bea-ad56-39a75ba44da7","Type":"ContainerDied","Data":"3279747f7d8f303e27816358a258743930935716a422dce18ad3351e7786d554"} Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.925943 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kn4d" Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.925949 4748 scope.go:117] "RemoveContainer" containerID="70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a" Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.946809 4748 scope.go:117] "RemoveContainer" containerID="47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92" Mar 20 11:45:08 crc kubenswrapper[4748]: I0320 11:45:08.965599 4748 scope.go:117] "RemoveContainer" containerID="8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c" Mar 20 11:45:09 crc kubenswrapper[4748]: I0320 11:45:09.011415 4748 scope.go:117] "RemoveContainer" containerID="70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a" Mar 20 11:45:09 crc kubenswrapper[4748]: E0320 11:45:09.012009 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a\": container with ID starting with 70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a not found: ID does not exist" containerID="70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a" Mar 20 11:45:09 crc kubenswrapper[4748]: I0320 11:45:09.012043 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a"} err="failed to get container status 
\"70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a\": rpc error: code = NotFound desc = could not find container \"70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a\": container with ID starting with 70372c5a98aea8f1332ac1d1bf47127bb095b917c66ba66f02a3598265a9fa4a not found: ID does not exist" Mar 20 11:45:09 crc kubenswrapper[4748]: I0320 11:45:09.012065 4748 scope.go:117] "RemoveContainer" containerID="47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92" Mar 20 11:45:09 crc kubenswrapper[4748]: E0320 11:45:09.012382 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92\": container with ID starting with 47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92 not found: ID does not exist" containerID="47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92" Mar 20 11:45:09 crc kubenswrapper[4748]: I0320 11:45:09.012432 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92"} err="failed to get container status \"47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92\": rpc error: code = NotFound desc = could not find container \"47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92\": container with ID starting with 47802c0de561de5586456002b03803e80d4f6a95d8a358cf4470023d9b1c2e92 not found: ID does not exist" Mar 20 11:45:09 crc kubenswrapper[4748]: I0320 11:45:09.012505 4748 scope.go:117] "RemoveContainer" containerID="8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c" Mar 20 11:45:09 crc kubenswrapper[4748]: E0320 11:45:09.012826 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c\": container with ID starting with 8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c not found: ID does not exist" containerID="8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c" Mar 20 11:45:09 crc kubenswrapper[4748]: I0320 11:45:09.012893 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c"} err="failed to get container status \"8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c\": rpc error: code = NotFound desc = could not find container \"8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c\": container with ID starting with 8a6186263327eccb3b4faf802b54f8ee0d6f3c6ebe6cfa8d96b4cf7350408c5c not found: ID does not exist" Mar 20 11:45:09 crc kubenswrapper[4748]: I0320 11:45:09.856330 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3021b62b-12d9-4bea-ad56-39a75ba44da7" (UID: "3021b62b-12d9-4bea-ad56-39a75ba44da7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:45:09 crc kubenswrapper[4748]: I0320 11:45:09.860824 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3021b62b-12d9-4bea-ad56-39a75ba44da7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:10 crc kubenswrapper[4748]: I0320 11:45:10.179273 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8kn4d"] Mar 20 11:45:10 crc kubenswrapper[4748]: I0320 11:45:10.223604 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8kn4d"] Mar 20 11:45:11 crc kubenswrapper[4748]: I0320 11:45:11.528687 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3021b62b-12d9-4bea-ad56-39a75ba44da7" path="/var/lib/kubelet/pods/3021b62b-12d9-4bea-ad56-39a75ba44da7/volumes" Mar 20 11:45:17 crc kubenswrapper[4748]: I0320 11:45:17.515944 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:45:17 crc kubenswrapper[4748]: E0320 11:45:17.516848 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:45:22 crc kubenswrapper[4748]: I0320 11:45:22.569159 4748 scope.go:117] "RemoveContainer" containerID="1e7b1cf60089813463bee8757db9fe6e1668ec9a86893c9a2657ea67cb2d3cb0" Mar 20 11:45:28 crc kubenswrapper[4748]: I0320 11:45:28.515031 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:45:28 crc kubenswrapper[4748]: E0320 
11:45:28.515894 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:45:43 crc kubenswrapper[4748]: I0320 11:45:43.515643 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:45:43 crc kubenswrapper[4748]: E0320 11:45:43.516415 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:45:57 crc kubenswrapper[4748]: I0320 11:45:57.515853 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:45:57 crc kubenswrapper[4748]: E0320 11:45:57.516669 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.150698 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566786-hdlq9"] Mar 20 11:46:00 crc 
kubenswrapper[4748]: E0320 11:46:00.151890 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3021b62b-12d9-4bea-ad56-39a75ba44da7" containerName="extract-utilities" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.151905 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3021b62b-12d9-4bea-ad56-39a75ba44da7" containerName="extract-utilities" Mar 20 11:46:00 crc kubenswrapper[4748]: E0320 11:46:00.151918 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be605b49-d384-4a69-b79d-f3ae02325da4" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.151925 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="be605b49-d384-4a69-b79d-f3ae02325da4" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4748]: E0320 11:46:00.151938 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3021b62b-12d9-4bea-ad56-39a75ba44da7" containerName="extract-content" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.151944 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3021b62b-12d9-4bea-ad56-39a75ba44da7" containerName="extract-content" Mar 20 11:46:00 crc kubenswrapper[4748]: E0320 11:46:00.151969 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3021b62b-12d9-4bea-ad56-39a75ba44da7" containerName="registry-server" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.151974 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3021b62b-12d9-4bea-ad56-39a75ba44da7" containerName="registry-server" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.152182 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="3021b62b-12d9-4bea-ad56-39a75ba44da7" containerName="registry-server" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.152205 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="be605b49-d384-4a69-b79d-f3ae02325da4" containerName="collect-profiles" Mar 20 11:46:00 
crc kubenswrapper[4748]: I0320 11:46:00.153160 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-hdlq9" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.157404 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.157648 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.157827 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.160998 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-hdlq9"] Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.243256 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ms7p\" (UniqueName: \"kubernetes.io/projected/cf33acae-b0f1-469d-a655-28fa87914356-kube-api-access-8ms7p\") pod \"auto-csr-approver-29566786-hdlq9\" (UID: \"cf33acae-b0f1-469d-a655-28fa87914356\") " pod="openshift-infra/auto-csr-approver-29566786-hdlq9" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.344933 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ms7p\" (UniqueName: \"kubernetes.io/projected/cf33acae-b0f1-469d-a655-28fa87914356-kube-api-access-8ms7p\") pod \"auto-csr-approver-29566786-hdlq9\" (UID: \"cf33acae-b0f1-469d-a655-28fa87914356\") " pod="openshift-infra/auto-csr-approver-29566786-hdlq9" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.370562 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ms7p\" (UniqueName: \"kubernetes.io/projected/cf33acae-b0f1-469d-a655-28fa87914356-kube-api-access-8ms7p\") 
pod \"auto-csr-approver-29566786-hdlq9\" (UID: \"cf33acae-b0f1-469d-a655-28fa87914356\") " pod="openshift-infra/auto-csr-approver-29566786-hdlq9" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.488004 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-hdlq9" Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.944162 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-hdlq9"] Mar 20 11:46:00 crc kubenswrapper[4748]: I0320 11:46:00.960914 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:46:01 crc kubenswrapper[4748]: I0320 11:46:01.396282 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-hdlq9" event={"ID":"cf33acae-b0f1-469d-a655-28fa87914356","Type":"ContainerStarted","Data":"d1ce07cb07ddebe459e8c67f94f4767899172b32dbfc1be4abbafa08739fb484"} Mar 20 11:46:02 crc kubenswrapper[4748]: I0320 11:46:02.405221 4748 generic.go:334] "Generic (PLEG): container finished" podID="cf33acae-b0f1-469d-a655-28fa87914356" containerID="3927b8c0333840b426f73fb675e152ef07b0f09bf3e10d51f94e44853e11f317" exitCode=0 Mar 20 11:46:02 crc kubenswrapper[4748]: I0320 11:46:02.405285 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-hdlq9" event={"ID":"cf33acae-b0f1-469d-a655-28fa87914356","Type":"ContainerDied","Data":"3927b8c0333840b426f73fb675e152ef07b0f09bf3e10d51f94e44853e11f317"} Mar 20 11:46:03 crc kubenswrapper[4748]: I0320 11:46:03.758634 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-hdlq9" Mar 20 11:46:03 crc kubenswrapper[4748]: I0320 11:46:03.919448 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ms7p\" (UniqueName: \"kubernetes.io/projected/cf33acae-b0f1-469d-a655-28fa87914356-kube-api-access-8ms7p\") pod \"cf33acae-b0f1-469d-a655-28fa87914356\" (UID: \"cf33acae-b0f1-469d-a655-28fa87914356\") " Mar 20 11:46:03 crc kubenswrapper[4748]: I0320 11:46:03.927388 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf33acae-b0f1-469d-a655-28fa87914356-kube-api-access-8ms7p" (OuterVolumeSpecName: "kube-api-access-8ms7p") pod "cf33acae-b0f1-469d-a655-28fa87914356" (UID: "cf33acae-b0f1-469d-a655-28fa87914356"). InnerVolumeSpecName "kube-api-access-8ms7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:46:04 crc kubenswrapper[4748]: I0320 11:46:04.022719 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ms7p\" (UniqueName: \"kubernetes.io/projected/cf33acae-b0f1-469d-a655-28fa87914356-kube-api-access-8ms7p\") on node \"crc\" DevicePath \"\"" Mar 20 11:46:04 crc kubenswrapper[4748]: I0320 11:46:04.422324 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-hdlq9" event={"ID":"cf33acae-b0f1-469d-a655-28fa87914356","Type":"ContainerDied","Data":"d1ce07cb07ddebe459e8c67f94f4767899172b32dbfc1be4abbafa08739fb484"} Mar 20 11:46:04 crc kubenswrapper[4748]: I0320 11:46:04.422365 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ce07cb07ddebe459e8c67f94f4767899172b32dbfc1be4abbafa08739fb484" Mar 20 11:46:04 crc kubenswrapper[4748]: I0320 11:46:04.422414 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-hdlq9" Mar 20 11:46:04 crc kubenswrapper[4748]: I0320 11:46:04.825815 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-wm8dm"] Mar 20 11:46:04 crc kubenswrapper[4748]: I0320 11:46:04.833995 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-wm8dm"] Mar 20 11:46:05 crc kubenswrapper[4748]: I0320 11:46:05.529600 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f104ce4-9d59-4c4b-97d8-8b186f48bc00" path="/var/lib/kubelet/pods/4f104ce4-9d59-4c4b-97d8-8b186f48bc00/volumes" Mar 20 11:46:08 crc kubenswrapper[4748]: I0320 11:46:08.515252 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:46:08 crc kubenswrapper[4748]: E0320 11:46:08.515802 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:46:22 crc kubenswrapper[4748]: I0320 11:46:22.673571 4748 scope.go:117] "RemoveContainer" containerID="3be38a3d2958121772ea0fc9425d0d4401d392ed7ceaecb4d1ff03be9104ac74" Mar 20 11:46:23 crc kubenswrapper[4748]: I0320 11:46:23.515910 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:46:23 crc kubenswrapper[4748]: E0320 11:46:23.516193 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:46:37 crc kubenswrapper[4748]: I0320 11:46:37.515906 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:46:37 crc kubenswrapper[4748]: E0320 11:46:37.516614 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:46:48 crc kubenswrapper[4748]: I0320 11:46:48.515537 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933" Mar 20 11:46:48 crc kubenswrapper[4748]: I0320 11:46:48.809260 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"2ca3fadedbbf1ffc460403a3135324978873632bc777457c2622cd86b79e06ad"} Mar 20 11:46:58 crc kubenswrapper[4748]: I0320 11:46:58.940602 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xfw9k"] Mar 20 11:46:58 crc kubenswrapper[4748]: E0320 11:46:58.941703 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf33acae-b0f1-469d-a655-28fa87914356" containerName="oc" Mar 20 11:46:58 crc kubenswrapper[4748]: I0320 11:46:58.941718 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf33acae-b0f1-469d-a655-28fa87914356" containerName="oc" Mar 20 11:46:58 crc kubenswrapper[4748]: I0320 
11:46:58.941953 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf33acae-b0f1-469d-a655-28fa87914356" containerName="oc" Mar 20 11:46:58 crc kubenswrapper[4748]: I0320 11:46:58.944849 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:58 crc kubenswrapper[4748]: I0320 11:46:58.950327 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xfw9k"] Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.104182 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-catalog-content\") pod \"community-operators-xfw9k\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.104702 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-utilities\") pod \"community-operators-xfw9k\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.104948 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8bf2\" (UniqueName: \"kubernetes.io/projected/bdfc85e3-1e73-47e5-b4ae-31928ce48415-kube-api-access-q8bf2\") pod \"community-operators-xfw9k\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.206635 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8bf2\" (UniqueName: 
\"kubernetes.io/projected/bdfc85e3-1e73-47e5-b4ae-31928ce48415-kube-api-access-q8bf2\") pod \"community-operators-xfw9k\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.206732 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-catalog-content\") pod \"community-operators-xfw9k\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.206783 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-utilities\") pod \"community-operators-xfw9k\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.207278 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-utilities\") pod \"community-operators-xfw9k\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.207454 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-catalog-content\") pod \"community-operators-xfw9k\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.227820 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8bf2\" (UniqueName: 
\"kubernetes.io/projected/bdfc85e3-1e73-47e5-b4ae-31928ce48415-kube-api-access-q8bf2\") pod \"community-operators-xfw9k\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.274271 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:46:59 crc kubenswrapper[4748]: I0320 11:46:59.820475 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xfw9k"] Mar 20 11:47:00 crc kubenswrapper[4748]: W0320 11:47:00.176724 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdfc85e3_1e73_47e5_b4ae_31928ce48415.slice/crio-e027e5c557f7044c3187bc9caa8b53f5415fb5639d451d7138572fd231d088f9 WatchSource:0}: Error finding container e027e5c557f7044c3187bc9caa8b53f5415fb5639d451d7138572fd231d088f9: Status 404 returned error can't find the container with id e027e5c557f7044c3187bc9caa8b53f5415fb5639d451d7138572fd231d088f9 Mar 20 11:47:00 crc kubenswrapper[4748]: I0320 11:47:00.916077 4748 generic.go:334] "Generic (PLEG): container finished" podID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerID="e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab" exitCode=0 Mar 20 11:47:00 crc kubenswrapper[4748]: I0320 11:47:00.916163 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfw9k" event={"ID":"bdfc85e3-1e73-47e5-b4ae-31928ce48415","Type":"ContainerDied","Data":"e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab"} Mar 20 11:47:00 crc kubenswrapper[4748]: I0320 11:47:00.916651 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfw9k" 
event={"ID":"bdfc85e3-1e73-47e5-b4ae-31928ce48415","Type":"ContainerStarted","Data":"e027e5c557f7044c3187bc9caa8b53f5415fb5639d451d7138572fd231d088f9"} Mar 20 11:47:03 crc kubenswrapper[4748]: I0320 11:47:03.018957 4748 generic.go:334] "Generic (PLEG): container finished" podID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerID="654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf" exitCode=0 Mar 20 11:47:03 crc kubenswrapper[4748]: I0320 11:47:03.019082 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfw9k" event={"ID":"bdfc85e3-1e73-47e5-b4ae-31928ce48415","Type":"ContainerDied","Data":"654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf"} Mar 20 11:47:04 crc kubenswrapper[4748]: I0320 11:47:04.030186 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfw9k" event={"ID":"bdfc85e3-1e73-47e5-b4ae-31928ce48415","Type":"ContainerStarted","Data":"dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b"} Mar 20 11:47:04 crc kubenswrapper[4748]: I0320 11:47:04.053377 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xfw9k" podStartSLOduration=3.49726609 podStartE2EDuration="6.05335777s" podCreationTimestamp="2026-03-20 11:46:58 +0000 UTC" firstStartedPulling="2026-03-20 11:47:00.918526374 +0000 UTC m=+4256.060072188" lastFinishedPulling="2026-03-20 11:47:03.474618054 +0000 UTC m=+4258.616163868" observedRunningTime="2026-03-20 11:47:04.046102936 +0000 UTC m=+4259.187648750" watchObservedRunningTime="2026-03-20 11:47:04.05335777 +0000 UTC m=+4259.194903584" Mar 20 11:47:09 crc kubenswrapper[4748]: I0320 11:47:09.274963 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:47:09 crc kubenswrapper[4748]: I0320 11:47:09.275574 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:47:09 crc kubenswrapper[4748]: I0320 11:47:09.324091 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:47:10 crc kubenswrapper[4748]: I0320 11:47:10.125588 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:47:10 crc kubenswrapper[4748]: I0320 11:47:10.183398 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xfw9k"] Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.097935 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xfw9k" podUID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerName="registry-server" containerID="cri-o://dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b" gracePeriod=2 Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.673603 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.798498 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8bf2\" (UniqueName: \"kubernetes.io/projected/bdfc85e3-1e73-47e5-b4ae-31928ce48415-kube-api-access-q8bf2\") pod \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.798702 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-catalog-content\") pod \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.799112 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-utilities\") pod \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\" (UID: \"bdfc85e3-1e73-47e5-b4ae-31928ce48415\") " Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.800850 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-utilities" (OuterVolumeSpecName: "utilities") pod "bdfc85e3-1e73-47e5-b4ae-31928ce48415" (UID: "bdfc85e3-1e73-47e5-b4ae-31928ce48415"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.810311 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdfc85e3-1e73-47e5-b4ae-31928ce48415-kube-api-access-q8bf2" (OuterVolumeSpecName: "kube-api-access-q8bf2") pod "bdfc85e3-1e73-47e5-b4ae-31928ce48415" (UID: "bdfc85e3-1e73-47e5-b4ae-31928ce48415"). InnerVolumeSpecName "kube-api-access-q8bf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.864082 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdfc85e3-1e73-47e5-b4ae-31928ce48415" (UID: "bdfc85e3-1e73-47e5-b4ae-31928ce48415"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.900787 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8bf2\" (UniqueName: \"kubernetes.io/projected/bdfc85e3-1e73-47e5-b4ae-31928ce48415-kube-api-access-q8bf2\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.900867 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:12 crc kubenswrapper[4748]: I0320 11:47:12.900880 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdfc85e3-1e73-47e5-b4ae-31928ce48415-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.110079 4748 generic.go:334] "Generic (PLEG): container finished" podID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerID="dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b" exitCode=0 Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.110148 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfw9k" event={"ID":"bdfc85e3-1e73-47e5-b4ae-31928ce48415","Type":"ContainerDied","Data":"dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b"} Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.110205 4748 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-xfw9k" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.110449 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfw9k" event={"ID":"bdfc85e3-1e73-47e5-b4ae-31928ce48415","Type":"ContainerDied","Data":"e027e5c557f7044c3187bc9caa8b53f5415fb5639d451d7138572fd231d088f9"} Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.110480 4748 scope.go:117] "RemoveContainer" containerID="dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.130072 4748 scope.go:117] "RemoveContainer" containerID="654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.155400 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xfw9k"] Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.167313 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xfw9k"] Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.172503 4748 scope.go:117] "RemoveContainer" containerID="e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.196917 4748 scope.go:117] "RemoveContainer" containerID="dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b" Mar 20 11:47:13 crc kubenswrapper[4748]: E0320 11:47:13.197425 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b\": container with ID starting with dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b not found: ID does not exist" containerID="dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.197473 
4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b"} err="failed to get container status \"dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b\": rpc error: code = NotFound desc = could not find container \"dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b\": container with ID starting with dd930c43180022a774e4122b9e2e7c617a4ae26590b0d01accf860f19335c64b not found: ID does not exist" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.197501 4748 scope.go:117] "RemoveContainer" containerID="654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf" Mar 20 11:47:13 crc kubenswrapper[4748]: E0320 11:47:13.197873 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf\": container with ID starting with 654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf not found: ID does not exist" containerID="654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.197902 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf"} err="failed to get container status \"654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf\": rpc error: code = NotFound desc = could not find container \"654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf\": container with ID starting with 654118f182c154ffcb454017eb0a39b0fbb79dc7410f312d711fc150f5282fdf not found: ID does not exist" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.197936 4748 scope.go:117] "RemoveContainer" containerID="e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab" Mar 20 11:47:13 crc kubenswrapper[4748]: E0320 
11:47:13.198197 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab\": container with ID starting with e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab not found: ID does not exist" containerID="e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.198229 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab"} err="failed to get container status \"e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab\": rpc error: code = NotFound desc = could not find container \"e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab\": container with ID starting with e7bdca31ff3e6d007a6b715285a7c34ef28285394f1da901ca991613105404ab not found: ID does not exist" Mar 20 11:47:13 crc kubenswrapper[4748]: I0320 11:47:13.527716 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" path="/var/lib/kubelet/pods/bdfc85e3-1e73-47e5-b4ae-31928ce48415/volumes" Mar 20 11:47:26 crc kubenswrapper[4748]: I0320 11:47:26.216392 4748 generic.go:334] "Generic (PLEG): container finished" podID="da9b4776-4f59-46e4-9cdf-953b0a7f83bf" containerID="6264d77b4f783d8c446f419c3e26316d6922a8e2bba2ed44c27ecc804310b54a" exitCode=0 Mar 20 11:47:26 crc kubenswrapper[4748]: I0320 11:47:26.216494 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"da9b4776-4f59-46e4-9cdf-953b0a7f83bf","Type":"ContainerDied","Data":"6264d77b4f783d8c446f419c3e26316d6922a8e2bba2ed44c27ecc804310b54a"} Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.592761 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.686256 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.686334 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-temporary\") pod \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.686660 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-config-data\") pod \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.686683 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stbvw\" (UniqueName: \"kubernetes.io/projected/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-kube-api-access-stbvw\") pod \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.686708 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ca-certs\") pod \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.686735 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config-secret\") pod \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.686803 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-workdir\") pod \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.686880 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config\") pod \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.686912 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ssh-key\") pod \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\" (UID: \"da9b4776-4f59-46e4-9cdf-953b0a7f83bf\") " Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.687628 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "da9b4776-4f59-46e4-9cdf-953b0a7f83bf" (UID: "da9b4776-4f59-46e4-9cdf-953b0a7f83bf"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.687915 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-config-data" (OuterVolumeSpecName: "config-data") pod "da9b4776-4f59-46e4-9cdf-953b0a7f83bf" (UID: "da9b4776-4f59-46e4-9cdf-953b0a7f83bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.692692 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "da9b4776-4f59-46e4-9cdf-953b0a7f83bf" (UID: "da9b4776-4f59-46e4-9cdf-953b0a7f83bf"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.693284 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "da9b4776-4f59-46e4-9cdf-953b0a7f83bf" (UID: "da9b4776-4f59-46e4-9cdf-953b0a7f83bf"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.697933 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-kube-api-access-stbvw" (OuterVolumeSpecName: "kube-api-access-stbvw") pod "da9b4776-4f59-46e4-9cdf-953b0a7f83bf" (UID: "da9b4776-4f59-46e4-9cdf-953b0a7f83bf"). InnerVolumeSpecName "kube-api-access-stbvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.718419 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "da9b4776-4f59-46e4-9cdf-953b0a7f83bf" (UID: "da9b4776-4f59-46e4-9cdf-953b0a7f83bf"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.719128 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "da9b4776-4f59-46e4-9cdf-953b0a7f83bf" (UID: "da9b4776-4f59-46e4-9cdf-953b0a7f83bf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.724193 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "da9b4776-4f59-46e4-9cdf-953b0a7f83bf" (UID: "da9b4776-4f59-46e4-9cdf-953b0a7f83bf"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.741561 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "da9b4776-4f59-46e4-9cdf-953b0a7f83bf" (UID: "da9b4776-4f59-46e4-9cdf-953b0a7f83bf"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.788784 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.788820 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stbvw\" (UniqueName: \"kubernetes.io/projected/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-kube-api-access-stbvw\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.788855 4748 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.788870 4748 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.788881 4748 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.788895 4748 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.788906 4748 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:27 crc 
kubenswrapper[4748]: I0320 11:47:27.788948 4748 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.788962 4748 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/da9b4776-4f59-46e4-9cdf-953b0a7f83bf-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.812138 4748 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 11:47:27 crc kubenswrapper[4748]: I0320 11:47:27.890897 4748 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 11:47:28 crc kubenswrapper[4748]: I0320 11:47:28.236499 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"da9b4776-4f59-46e4-9cdf-953b0a7f83bf","Type":"ContainerDied","Data":"4ddb9546ce0d27085ea90fc667c1d432c125b2bbdb76fa2736308e846999e0ac"} Mar 20 11:47:28 crc kubenswrapper[4748]: I0320 11:47:28.236531 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 11:47:28 crc kubenswrapper[4748]: I0320 11:47:28.236550 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ddb9546ce0d27085ea90fc667c1d432c125b2bbdb76fa2736308e846999e0ac" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.105633 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 11:47:33 crc kubenswrapper[4748]: E0320 11:47:33.106665 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9b4776-4f59-46e4-9cdf-953b0a7f83bf" containerName="tempest-tests-tempest-tests-runner" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.106689 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9b4776-4f59-46e4-9cdf-953b0a7f83bf" containerName="tempest-tests-tempest-tests-runner" Mar 20 11:47:33 crc kubenswrapper[4748]: E0320 11:47:33.106724 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerName="extract-content" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.106736 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerName="extract-content" Mar 20 11:47:33 crc kubenswrapper[4748]: E0320 11:47:33.106864 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerName="registry-server" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.106880 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerName="registry-server" Mar 20 11:47:33 crc kubenswrapper[4748]: E0320 11:47:33.106915 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerName="extract-utilities" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.106928 4748 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerName="extract-utilities" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.107277 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9b4776-4f59-46e4-9cdf-953b0a7f83bf" containerName="tempest-tests-tempest-tests-runner" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.107294 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdfc85e3-1e73-47e5-b4ae-31928ce48415" containerName="registry-server" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.108326 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.111093 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fmdpt" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.121718 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.195205 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"46161687-c406-4f73-aced-74edbd5e2f81\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.195273 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb6n2\" (UniqueName: \"kubernetes.io/projected/46161687-c406-4f73-aced-74edbd5e2f81-kube-api-access-qb6n2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"46161687-c406-4f73-aced-74edbd5e2f81\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 
11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.296775 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"46161687-c406-4f73-aced-74edbd5e2f81\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.296826 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb6n2\" (UniqueName: \"kubernetes.io/projected/46161687-c406-4f73-aced-74edbd5e2f81-kube-api-access-qb6n2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"46161687-c406-4f73-aced-74edbd5e2f81\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.297376 4748 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"46161687-c406-4f73-aced-74edbd5e2f81\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.321163 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb6n2\" (UniqueName: \"kubernetes.io/projected/46161687-c406-4f73-aced-74edbd5e2f81-kube-api-access-qb6n2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"46161687-c406-4f73-aced-74edbd5e2f81\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.329100 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"46161687-c406-4f73-aced-74edbd5e2f81\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.440607 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 11:47:33 crc kubenswrapper[4748]: I0320 11:47:33.865353 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 11:47:34 crc kubenswrapper[4748]: I0320 11:47:34.296499 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"46161687-c406-4f73-aced-74edbd5e2f81","Type":"ContainerStarted","Data":"defe8174759520137e23b6a6cc3cd8b45a14161ab04d1dc08293d359dc056d1e"} Mar 20 11:47:36 crc kubenswrapper[4748]: I0320 11:47:36.316228 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"46161687-c406-4f73-aced-74edbd5e2f81","Type":"ContainerStarted","Data":"baad455fc42567062bf56b8a8f8d48e7b3f34baa7a70e82b81df8fe880226362"} Mar 20 11:47:36 crc kubenswrapper[4748]: I0320 11:47:36.331948 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.46761939 podStartE2EDuration="3.331925267s" podCreationTimestamp="2026-03-20 11:47:33 +0000 UTC" firstStartedPulling="2026-03-20 11:47:34.285785484 +0000 UTC m=+4289.427331298" lastFinishedPulling="2026-03-20 11:47:35.150091361 +0000 UTC m=+4290.291637175" observedRunningTime="2026-03-20 11:47:36.328783678 +0000 UTC m=+4291.470351942" watchObservedRunningTime="2026-03-20 11:47:36.331925267 +0000 UTC m=+4291.473471081" Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.150696 4748 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29566788-v2zsq"] Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.154536 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-v2zsq" Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.157076 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.159591 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.159831 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.185691 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-v2zsq"] Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.263143 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbpdl\" (UniqueName: \"kubernetes.io/projected/c53b89d4-6145-4717-815a-00d6a80049a4-kube-api-access-bbpdl\") pod \"auto-csr-approver-29566788-v2zsq\" (UID: \"c53b89d4-6145-4717-815a-00d6a80049a4\") " pod="openshift-infra/auto-csr-approver-29566788-v2zsq" Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.365089 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbpdl\" (UniqueName: \"kubernetes.io/projected/c53b89d4-6145-4717-815a-00d6a80049a4-kube-api-access-bbpdl\") pod \"auto-csr-approver-29566788-v2zsq\" (UID: \"c53b89d4-6145-4717-815a-00d6a80049a4\") " pod="openshift-infra/auto-csr-approver-29566788-v2zsq" Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.388112 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbpdl\" (UniqueName: 
\"kubernetes.io/projected/c53b89d4-6145-4717-815a-00d6a80049a4-kube-api-access-bbpdl\") pod \"auto-csr-approver-29566788-v2zsq\" (UID: \"c53b89d4-6145-4717-815a-00d6a80049a4\") " pod="openshift-infra/auto-csr-approver-29566788-v2zsq" Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.487904 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-v2zsq" Mar 20 11:48:00 crc kubenswrapper[4748]: I0320 11:48:00.953998 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-v2zsq"] Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.405646 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xzd8l/must-gather-ksq57"] Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.407910 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xzd8l/must-gather-ksq57" Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.409976 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xzd8l"/"default-dockercfg-vx2c5" Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.410313 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xzd8l"/"openshift-service-ca.crt" Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.410330 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xzd8l"/"kube-root-ca.crt" Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.415843 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xzd8l/must-gather-ksq57"] Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.548644 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-v2zsq" 
event={"ID":"c53b89d4-6145-4717-815a-00d6a80049a4","Type":"ContainerStarted","Data":"c10c347c47f7085ada3b1c576fe3556185e8f88daa03439df573ad62e27e680f"} Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.592969 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-must-gather-output\") pod \"must-gather-ksq57\" (UID: \"90b5aeb3-bb8e-40ae-872f-72c5f98e260f\") " pod="openshift-must-gather-xzd8l/must-gather-ksq57" Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.593107 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqd6x\" (UniqueName: \"kubernetes.io/projected/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-kube-api-access-gqd6x\") pod \"must-gather-ksq57\" (UID: \"90b5aeb3-bb8e-40ae-872f-72c5f98e260f\") " pod="openshift-must-gather-xzd8l/must-gather-ksq57" Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.695097 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-must-gather-output\") pod \"must-gather-ksq57\" (UID: \"90b5aeb3-bb8e-40ae-872f-72c5f98e260f\") " pod="openshift-must-gather-xzd8l/must-gather-ksq57" Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.695942 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-must-gather-output\") pod \"must-gather-ksq57\" (UID: \"90b5aeb3-bb8e-40ae-872f-72c5f98e260f\") " pod="openshift-must-gather-xzd8l/must-gather-ksq57" Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.696313 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqd6x\" (UniqueName: 
\"kubernetes.io/projected/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-kube-api-access-gqd6x\") pod \"must-gather-ksq57\" (UID: \"90b5aeb3-bb8e-40ae-872f-72c5f98e260f\") " pod="openshift-must-gather-xzd8l/must-gather-ksq57" Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.715058 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqd6x\" (UniqueName: \"kubernetes.io/projected/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-kube-api-access-gqd6x\") pod \"must-gather-ksq57\" (UID: \"90b5aeb3-bb8e-40ae-872f-72c5f98e260f\") " pod="openshift-must-gather-xzd8l/must-gather-ksq57" Mar 20 11:48:01 crc kubenswrapper[4748]: I0320 11:48:01.728378 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xzd8l/must-gather-ksq57" Mar 20 11:48:02 crc kubenswrapper[4748]: I0320 11:48:02.206097 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xzd8l/must-gather-ksq57"] Mar 20 11:48:02 crc kubenswrapper[4748]: W0320 11:48:02.208830 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90b5aeb3_bb8e_40ae_872f_72c5f98e260f.slice/crio-1a94c170dd7493baf79664394598f7979f315c0431b27144c891a39379200142 WatchSource:0}: Error finding container 1a94c170dd7493baf79664394598f7979f315c0431b27144c891a39379200142: Status 404 returned error can't find the container with id 1a94c170dd7493baf79664394598f7979f315c0431b27144c891a39379200142 Mar 20 11:48:02 crc kubenswrapper[4748]: I0320 11:48:02.559086 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-v2zsq" event={"ID":"c53b89d4-6145-4717-815a-00d6a80049a4","Type":"ContainerStarted","Data":"af83612a5d6f19ae622228644002233a9d3a159489750155ead34016e5ae33ff"} Mar 20 11:48:02 crc kubenswrapper[4748]: I0320 11:48:02.560472 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-xzd8l/must-gather-ksq57" event={"ID":"90b5aeb3-bb8e-40ae-872f-72c5f98e260f","Type":"ContainerStarted","Data":"1a94c170dd7493baf79664394598f7979f315c0431b27144c891a39379200142"} Mar 20 11:48:02 crc kubenswrapper[4748]: I0320 11:48:02.582042 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566788-v2zsq" podStartSLOduration=1.470847855 podStartE2EDuration="2.582013234s" podCreationTimestamp="2026-03-20 11:48:00 +0000 UTC" firstStartedPulling="2026-03-20 11:48:00.959309329 +0000 UTC m=+4316.100855143" lastFinishedPulling="2026-03-20 11:48:02.070474708 +0000 UTC m=+4317.212020522" observedRunningTime="2026-03-20 11:48:02.578401103 +0000 UTC m=+4317.719946927" watchObservedRunningTime="2026-03-20 11:48:02.582013234 +0000 UTC m=+4317.723559048" Mar 20 11:48:03 crc kubenswrapper[4748]: I0320 11:48:03.570878 4748 generic.go:334] "Generic (PLEG): container finished" podID="c53b89d4-6145-4717-815a-00d6a80049a4" containerID="af83612a5d6f19ae622228644002233a9d3a159489750155ead34016e5ae33ff" exitCode=0 Mar 20 11:48:03 crc kubenswrapper[4748]: I0320 11:48:03.571018 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-v2zsq" event={"ID":"c53b89d4-6145-4717-815a-00d6a80049a4","Type":"ContainerDied","Data":"af83612a5d6f19ae622228644002233a9d3a159489750155ead34016e5ae33ff"} Mar 20 11:48:06 crc kubenswrapper[4748]: I0320 11:48:06.605421 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-v2zsq" event={"ID":"c53b89d4-6145-4717-815a-00d6a80049a4","Type":"ContainerDied","Data":"c10c347c47f7085ada3b1c576fe3556185e8f88daa03439df573ad62e27e680f"} Mar 20 11:48:06 crc kubenswrapper[4748]: I0320 11:48:06.606088 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c10c347c47f7085ada3b1c576fe3556185e8f88daa03439df573ad62e27e680f" Mar 20 11:48:06 crc kubenswrapper[4748]: 
I0320 11:48:06.613338 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-v2zsq" Mar 20 11:48:06 crc kubenswrapper[4748]: I0320 11:48:06.722651 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbpdl\" (UniqueName: \"kubernetes.io/projected/c53b89d4-6145-4717-815a-00d6a80049a4-kube-api-access-bbpdl\") pod \"c53b89d4-6145-4717-815a-00d6a80049a4\" (UID: \"c53b89d4-6145-4717-815a-00d6a80049a4\") " Mar 20 11:48:06 crc kubenswrapper[4748]: I0320 11:48:06.728786 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53b89d4-6145-4717-815a-00d6a80049a4-kube-api-access-bbpdl" (OuterVolumeSpecName: "kube-api-access-bbpdl") pod "c53b89d4-6145-4717-815a-00d6a80049a4" (UID: "c53b89d4-6145-4717-815a-00d6a80049a4"). InnerVolumeSpecName "kube-api-access-bbpdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:48:06 crc kubenswrapper[4748]: I0320 11:48:06.825325 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbpdl\" (UniqueName: \"kubernetes.io/projected/c53b89d4-6145-4717-815a-00d6a80049a4-kube-api-access-bbpdl\") on node \"crc\" DevicePath \"\"" Mar 20 11:48:07 crc kubenswrapper[4748]: I0320 11:48:07.615395 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-v2zsq" Mar 20 11:48:07 crc kubenswrapper[4748]: I0320 11:48:07.680696 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-vvbtb"] Mar 20 11:48:07 crc kubenswrapper[4748]: I0320 11:48:07.689976 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-vvbtb"] Mar 20 11:48:08 crc kubenswrapper[4748]: I0320 11:48:08.626724 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/must-gather-ksq57" event={"ID":"90b5aeb3-bb8e-40ae-872f-72c5f98e260f","Type":"ContainerStarted","Data":"234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1"} Mar 20 11:48:09 crc kubenswrapper[4748]: I0320 11:48:09.531580 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c85e48f-fb2f-489f-8143-226f44751edd" path="/var/lib/kubelet/pods/0c85e48f-fb2f-489f-8143-226f44751edd/volumes" Mar 20 11:48:09 crc kubenswrapper[4748]: I0320 11:48:09.656868 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/must-gather-ksq57" event={"ID":"90b5aeb3-bb8e-40ae-872f-72c5f98e260f","Type":"ContainerStarted","Data":"3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f"} Mar 20 11:48:09 crc kubenswrapper[4748]: I0320 11:48:09.679252 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xzd8l/must-gather-ksq57" podStartSLOduration=2.6202859 podStartE2EDuration="8.679236522s" podCreationTimestamp="2026-03-20 11:48:01 +0000 UTC" firstStartedPulling="2026-03-20 11:48:02.213781652 +0000 UTC m=+4317.355327466" lastFinishedPulling="2026-03-20 11:48:08.272732264 +0000 UTC m=+4323.414278088" observedRunningTime="2026-03-20 11:48:09.677652192 +0000 UTC m=+4324.819198006" watchObservedRunningTime="2026-03-20 11:48:09.679236522 +0000 UTC m=+4324.820782326" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 
11:48:12.155114 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xzd8l/crc-debug-wtzc7"] Mar 20 11:48:12 crc kubenswrapper[4748]: E0320 11:48:12.156362 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53b89d4-6145-4717-815a-00d6a80049a4" containerName="oc" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.156383 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53b89d4-6145-4717-815a-00d6a80049a4" containerName="oc" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.156674 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53b89d4-6145-4717-815a-00d6a80049a4" containerName="oc" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.157639 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.227334 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb9m2\" (UniqueName: \"kubernetes.io/projected/22fd9db9-a5de-4f99-b031-3420d8e03b0d-kube-api-access-lb9m2\") pod \"crc-debug-wtzc7\" (UID: \"22fd9db9-a5de-4f99-b031-3420d8e03b0d\") " pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.227747 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22fd9db9-a5de-4f99-b031-3420d8e03b0d-host\") pod \"crc-debug-wtzc7\" (UID: \"22fd9db9-a5de-4f99-b031-3420d8e03b0d\") " pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.329508 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb9m2\" (UniqueName: \"kubernetes.io/projected/22fd9db9-a5de-4f99-b031-3420d8e03b0d-kube-api-access-lb9m2\") pod \"crc-debug-wtzc7\" (UID: 
\"22fd9db9-a5de-4f99-b031-3420d8e03b0d\") " pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.329592 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22fd9db9-a5de-4f99-b031-3420d8e03b0d-host\") pod \"crc-debug-wtzc7\" (UID: \"22fd9db9-a5de-4f99-b031-3420d8e03b0d\") " pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.329691 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22fd9db9-a5de-4f99-b031-3420d8e03b0d-host\") pod \"crc-debug-wtzc7\" (UID: \"22fd9db9-a5de-4f99-b031-3420d8e03b0d\") " pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.351760 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb9m2\" (UniqueName: \"kubernetes.io/projected/22fd9db9-a5de-4f99-b031-3420d8e03b0d-kube-api-access-lb9m2\") pod \"crc-debug-wtzc7\" (UID: \"22fd9db9-a5de-4f99-b031-3420d8e03b0d\") " pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.478962 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" Mar 20 11:48:12 crc kubenswrapper[4748]: W0320 11:48:12.525907 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22fd9db9_a5de_4f99_b031_3420d8e03b0d.slice/crio-84140f5dec360aafc1064fa5e237848d7e1e970c03e6e1ec9e99a88051c61e17 WatchSource:0}: Error finding container 84140f5dec360aafc1064fa5e237848d7e1e970c03e6e1ec9e99a88051c61e17: Status 404 returned error can't find the container with id 84140f5dec360aafc1064fa5e237848d7e1e970c03e6e1ec9e99a88051c61e17 Mar 20 11:48:12 crc kubenswrapper[4748]: I0320 11:48:12.684710 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" event={"ID":"22fd9db9-a5de-4f99-b031-3420d8e03b0d","Type":"ContainerStarted","Data":"84140f5dec360aafc1064fa5e237848d7e1e970c03e6e1ec9e99a88051c61e17"} Mar 20 11:48:22 crc kubenswrapper[4748]: I0320 11:48:22.798729 4748 scope.go:117] "RemoveContainer" containerID="fd6901472ea5b987df1064749005e0ef3b2d705e51d677ecf4c54828a8d3799f" Mar 20 11:48:25 crc kubenswrapper[4748]: I0320 11:48:25.812329 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" event={"ID":"22fd9db9-a5de-4f99-b031-3420d8e03b0d","Type":"ContainerStarted","Data":"2be97010bce97f769f1b67f353ad303768415e75c01fac7123ff6411bed4a7d5"} Mar 20 11:48:25 crc kubenswrapper[4748]: I0320 11:48:25.836293 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" podStartSLOduration=1.8959586769999999 podStartE2EDuration="13.83627504s" podCreationTimestamp="2026-03-20 11:48:12 +0000 UTC" firstStartedPulling="2026-03-20 11:48:12.531262625 +0000 UTC m=+4327.672808439" lastFinishedPulling="2026-03-20 11:48:24.471578988 +0000 UTC m=+4339.613124802" observedRunningTime="2026-03-20 11:48:25.833242443 +0000 UTC m=+4340.974788277" 
watchObservedRunningTime="2026-03-20 11:48:25.83627504 +0000 UTC m=+4340.977820854" Mar 20 11:49:11 crc kubenswrapper[4748]: I0320 11:49:11.211694 4748 generic.go:334] "Generic (PLEG): container finished" podID="22fd9db9-a5de-4f99-b031-3420d8e03b0d" containerID="2be97010bce97f769f1b67f353ad303768415e75c01fac7123ff6411bed4a7d5" exitCode=0 Mar 20 11:49:11 crc kubenswrapper[4748]: I0320 11:49:11.211803 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" event={"ID":"22fd9db9-a5de-4f99-b031-3420d8e03b0d","Type":"ContainerDied","Data":"2be97010bce97f769f1b67f353ad303768415e75c01fac7123ff6411bed4a7d5"} Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.358853 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.400101 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xzd8l/crc-debug-wtzc7"] Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.414631 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xzd8l/crc-debug-wtzc7"] Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.445623 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22fd9db9-a5de-4f99-b031-3420d8e03b0d-host\") pod \"22fd9db9-a5de-4f99-b031-3420d8e03b0d\" (UID: \"22fd9db9-a5de-4f99-b031-3420d8e03b0d\") " Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.445762 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22fd9db9-a5de-4f99-b031-3420d8e03b0d-host" (OuterVolumeSpecName: "host") pod "22fd9db9-a5de-4f99-b031-3420d8e03b0d" (UID: "22fd9db9-a5de-4f99-b031-3420d8e03b0d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.445786 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb9m2\" (UniqueName: \"kubernetes.io/projected/22fd9db9-a5de-4f99-b031-3420d8e03b0d-kube-api-access-lb9m2\") pod \"22fd9db9-a5de-4f99-b031-3420d8e03b0d\" (UID: \"22fd9db9-a5de-4f99-b031-3420d8e03b0d\") " Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.446548 4748 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22fd9db9-a5de-4f99-b031-3420d8e03b0d-host\") on node \"crc\" DevicePath \"\"" Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.461229 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22fd9db9-a5de-4f99-b031-3420d8e03b0d-kube-api-access-lb9m2" (OuterVolumeSpecName: "kube-api-access-lb9m2") pod "22fd9db9-a5de-4f99-b031-3420d8e03b0d" (UID: "22fd9db9-a5de-4f99-b031-3420d8e03b0d"). InnerVolumeSpecName "kube-api-access-lb9m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.548591 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb9m2\" (UniqueName: \"kubernetes.io/projected/22fd9db9-a5de-4f99-b031-3420d8e03b0d-kube-api-access-lb9m2\") on node \"crc\" DevicePath \"\"" Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.928760 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:49:12 crc kubenswrapper[4748]: I0320 11:49:12.929158 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.229136 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84140f5dec360aafc1064fa5e237848d7e1e970c03e6e1ec9e99a88051c61e17" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.229204 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-wtzc7" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.536174 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22fd9db9-a5de-4f99-b031-3420d8e03b0d" path="/var/lib/kubelet/pods/22fd9db9-a5de-4f99-b031-3420d8e03b0d/volumes" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.680727 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xzd8l/crc-debug-xfhfn"] Mar 20 11:49:13 crc kubenswrapper[4748]: E0320 11:49:13.681246 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fd9db9-a5de-4f99-b031-3420d8e03b0d" containerName="container-00" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.681271 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fd9db9-a5de-4f99-b031-3420d8e03b0d" containerName="container-00" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.681527 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="22fd9db9-a5de-4f99-b031-3420d8e03b0d" containerName="container-00" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.682305 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.770143 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bea0745f-dcad-4a08-9e73-8492800d8259-host\") pod \"crc-debug-xfhfn\" (UID: \"bea0745f-dcad-4a08-9e73-8492800d8259\") " pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.770346 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g927w\" (UniqueName: \"kubernetes.io/projected/bea0745f-dcad-4a08-9e73-8492800d8259-kube-api-access-g927w\") pod \"crc-debug-xfhfn\" (UID: \"bea0745f-dcad-4a08-9e73-8492800d8259\") " pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.872548 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bea0745f-dcad-4a08-9e73-8492800d8259-host\") pod \"crc-debug-xfhfn\" (UID: \"bea0745f-dcad-4a08-9e73-8492800d8259\") " pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.872669 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g927w\" (UniqueName: \"kubernetes.io/projected/bea0745f-dcad-4a08-9e73-8492800d8259-kube-api-access-g927w\") pod \"crc-debug-xfhfn\" (UID: \"bea0745f-dcad-4a08-9e73-8492800d8259\") " pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" Mar 20 11:49:13 crc kubenswrapper[4748]: I0320 11:49:13.872710 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bea0745f-dcad-4a08-9e73-8492800d8259-host\") pod \"crc-debug-xfhfn\" (UID: \"bea0745f-dcad-4a08-9e73-8492800d8259\") " pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" Mar 20 11:49:13 crc 
kubenswrapper[4748]: I0320 11:49:13.980675 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g927w\" (UniqueName: \"kubernetes.io/projected/bea0745f-dcad-4a08-9e73-8492800d8259-kube-api-access-g927w\") pod \"crc-debug-xfhfn\" (UID: \"bea0745f-dcad-4a08-9e73-8492800d8259\") " pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" Mar 20 11:49:14 crc kubenswrapper[4748]: I0320 11:49:14.021111 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" Mar 20 11:49:14 crc kubenswrapper[4748]: I0320 11:49:14.251640 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" event={"ID":"bea0745f-dcad-4a08-9e73-8492800d8259","Type":"ContainerStarted","Data":"d4f0b26d04ae27264e3f809c59a95a080662d7eef9fd33beb5ffea635938a109"} Mar 20 11:49:15 crc kubenswrapper[4748]: I0320 11:49:15.261346 4748 generic.go:334] "Generic (PLEG): container finished" podID="bea0745f-dcad-4a08-9e73-8492800d8259" containerID="78e6d1588f7cbac868528b66b9a591f39b76c8b4868a17ca120ec9b6d9f17127" exitCode=0 Mar 20 11:49:15 crc kubenswrapper[4748]: I0320 11:49:15.261420 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" event={"ID":"bea0745f-dcad-4a08-9e73-8492800d8259","Type":"ContainerDied","Data":"78e6d1588f7cbac868528b66b9a591f39b76c8b4868a17ca120ec9b6d9f17127"} Mar 20 11:49:16 crc kubenswrapper[4748]: I0320 11:49:16.375114 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" Mar 20 11:49:16 crc kubenswrapper[4748]: I0320 11:49:16.470327 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g927w\" (UniqueName: \"kubernetes.io/projected/bea0745f-dcad-4a08-9e73-8492800d8259-kube-api-access-g927w\") pod \"bea0745f-dcad-4a08-9e73-8492800d8259\" (UID: \"bea0745f-dcad-4a08-9e73-8492800d8259\") " Mar 20 11:49:16 crc kubenswrapper[4748]: I0320 11:49:16.470500 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bea0745f-dcad-4a08-9e73-8492800d8259-host\") pod \"bea0745f-dcad-4a08-9e73-8492800d8259\" (UID: \"bea0745f-dcad-4a08-9e73-8492800d8259\") " Mar 20 11:49:16 crc kubenswrapper[4748]: I0320 11:49:16.470995 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bea0745f-dcad-4a08-9e73-8492800d8259-host" (OuterVolumeSpecName: "host") pod "bea0745f-dcad-4a08-9e73-8492800d8259" (UID: "bea0745f-dcad-4a08-9e73-8492800d8259"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:49:16 crc kubenswrapper[4748]: I0320 11:49:16.476362 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea0745f-dcad-4a08-9e73-8492800d8259-kube-api-access-g927w" (OuterVolumeSpecName: "kube-api-access-g927w") pod "bea0745f-dcad-4a08-9e73-8492800d8259" (UID: "bea0745f-dcad-4a08-9e73-8492800d8259"). InnerVolumeSpecName "kube-api-access-g927w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:49:16 crc kubenswrapper[4748]: I0320 11:49:16.572620 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g927w\" (UniqueName: \"kubernetes.io/projected/bea0745f-dcad-4a08-9e73-8492800d8259-kube-api-access-g927w\") on node \"crc\" DevicePath \"\"" Mar 20 11:49:16 crc kubenswrapper[4748]: I0320 11:49:16.572901 4748 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bea0745f-dcad-4a08-9e73-8492800d8259-host\") on node \"crc\" DevicePath \"\"" Mar 20 11:49:17 crc kubenswrapper[4748]: I0320 11:49:17.279869 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" event={"ID":"bea0745f-dcad-4a08-9e73-8492800d8259","Type":"ContainerDied","Data":"d4f0b26d04ae27264e3f809c59a95a080662d7eef9fd33beb5ffea635938a109"} Mar 20 11:49:17 crc kubenswrapper[4748]: I0320 11:49:17.279961 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-xfhfn" Mar 20 11:49:17 crc kubenswrapper[4748]: I0320 11:49:17.296077 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f0b26d04ae27264e3f809c59a95a080662d7eef9fd33beb5ffea635938a109" Mar 20 11:49:17 crc kubenswrapper[4748]: I0320 11:49:17.310372 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xzd8l/crc-debug-xfhfn"] Mar 20 11:49:17 crc kubenswrapper[4748]: I0320 11:49:17.322256 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xzd8l/crc-debug-xfhfn"] Mar 20 11:49:17 crc kubenswrapper[4748]: I0320 11:49:17.526864 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea0745f-dcad-4a08-9e73-8492800d8259" path="/var/lib/kubelet/pods/bea0745f-dcad-4a08-9e73-8492800d8259/volumes" Mar 20 11:49:18 crc kubenswrapper[4748]: I0320 11:49:18.495369 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xzd8l/crc-debug-9zxlx"] Mar 20 11:49:18 crc kubenswrapper[4748]: E0320 11:49:18.496201 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea0745f-dcad-4a08-9e73-8492800d8259" containerName="container-00" Mar 20 11:49:18 crc kubenswrapper[4748]: I0320 11:49:18.496222 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea0745f-dcad-4a08-9e73-8492800d8259" containerName="container-00" Mar 20 11:49:18 crc kubenswrapper[4748]: I0320 11:49:18.496403 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea0745f-dcad-4a08-9e73-8492800d8259" containerName="container-00" Mar 20 11:49:18 crc kubenswrapper[4748]: I0320 11:49:18.497082 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" Mar 20 11:49:18 crc kubenswrapper[4748]: I0320 11:49:18.622135 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67898f44-39f5-4f13-8ab1-11c370ef85f3-host\") pod \"crc-debug-9zxlx\" (UID: \"67898f44-39f5-4f13-8ab1-11c370ef85f3\") " pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" Mar 20 11:49:18 crc kubenswrapper[4748]: I0320 11:49:18.622480 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t56ld\" (UniqueName: \"kubernetes.io/projected/67898f44-39f5-4f13-8ab1-11c370ef85f3-kube-api-access-t56ld\") pod \"crc-debug-9zxlx\" (UID: \"67898f44-39f5-4f13-8ab1-11c370ef85f3\") " pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" Mar 20 11:49:18 crc kubenswrapper[4748]: I0320 11:49:18.725613 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t56ld\" (UniqueName: \"kubernetes.io/projected/67898f44-39f5-4f13-8ab1-11c370ef85f3-kube-api-access-t56ld\") pod \"crc-debug-9zxlx\" (UID: \"67898f44-39f5-4f13-8ab1-11c370ef85f3\") " pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" Mar 20 11:49:18 crc kubenswrapper[4748]: I0320 11:49:18.726177 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67898f44-39f5-4f13-8ab1-11c370ef85f3-host\") pod \"crc-debug-9zxlx\" (UID: \"67898f44-39f5-4f13-8ab1-11c370ef85f3\") " pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" Mar 20 11:49:18 crc kubenswrapper[4748]: I0320 11:49:18.726247 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67898f44-39f5-4f13-8ab1-11c370ef85f3-host\") pod \"crc-debug-9zxlx\" (UID: \"67898f44-39f5-4f13-8ab1-11c370ef85f3\") " pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" Mar 20 11:49:18 crc 
kubenswrapper[4748]: I0320 11:49:18.744994 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t56ld\" (UniqueName: \"kubernetes.io/projected/67898f44-39f5-4f13-8ab1-11c370ef85f3-kube-api-access-t56ld\") pod \"crc-debug-9zxlx\" (UID: \"67898f44-39f5-4f13-8ab1-11c370ef85f3\") " pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" Mar 20 11:49:18 crc kubenswrapper[4748]: I0320 11:49:18.823333 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" Mar 20 11:49:18 crc kubenswrapper[4748]: W0320 11:49:18.853102 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67898f44_39f5_4f13_8ab1_11c370ef85f3.slice/crio-94a35f198506007f7f17f8955955dc9fb08a6e0b8d597130eb3d1c250bc779be WatchSource:0}: Error finding container 94a35f198506007f7f17f8955955dc9fb08a6e0b8d597130eb3d1c250bc779be: Status 404 returned error can't find the container with id 94a35f198506007f7f17f8955955dc9fb08a6e0b8d597130eb3d1c250bc779be Mar 20 11:49:19 crc kubenswrapper[4748]: I0320 11:49:19.295658 4748 generic.go:334] "Generic (PLEG): container finished" podID="67898f44-39f5-4f13-8ab1-11c370ef85f3" containerID="683a223ffabc15a9baf4c6f95d7a8a26fc820b3e3d96debcc0f6a01f075006dd" exitCode=0 Mar 20 11:49:19 crc kubenswrapper[4748]: I0320 11:49:19.295893 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" event={"ID":"67898f44-39f5-4f13-8ab1-11c370ef85f3","Type":"ContainerDied","Data":"683a223ffabc15a9baf4c6f95d7a8a26fc820b3e3d96debcc0f6a01f075006dd"} Mar 20 11:49:19 crc kubenswrapper[4748]: I0320 11:49:19.295950 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" event={"ID":"67898f44-39f5-4f13-8ab1-11c370ef85f3","Type":"ContainerStarted","Data":"94a35f198506007f7f17f8955955dc9fb08a6e0b8d597130eb3d1c250bc779be"} Mar 20 
11:49:19 crc kubenswrapper[4748]: I0320 11:49:19.332264 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xzd8l/crc-debug-9zxlx"] Mar 20 11:49:19 crc kubenswrapper[4748]: I0320 11:49:19.343141 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xzd8l/crc-debug-9zxlx"] Mar 20 11:49:20 crc kubenswrapper[4748]: I0320 11:49:20.417883 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" Mar 20 11:49:20 crc kubenswrapper[4748]: I0320 11:49:20.558324 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t56ld\" (UniqueName: \"kubernetes.io/projected/67898f44-39f5-4f13-8ab1-11c370ef85f3-kube-api-access-t56ld\") pod \"67898f44-39f5-4f13-8ab1-11c370ef85f3\" (UID: \"67898f44-39f5-4f13-8ab1-11c370ef85f3\") " Mar 20 11:49:20 crc kubenswrapper[4748]: I0320 11:49:20.558505 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67898f44-39f5-4f13-8ab1-11c370ef85f3-host\") pod \"67898f44-39f5-4f13-8ab1-11c370ef85f3\" (UID: \"67898f44-39f5-4f13-8ab1-11c370ef85f3\") " Mar 20 11:49:20 crc kubenswrapper[4748]: I0320 11:49:20.558623 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67898f44-39f5-4f13-8ab1-11c370ef85f3-host" (OuterVolumeSpecName: "host") pod "67898f44-39f5-4f13-8ab1-11c370ef85f3" (UID: "67898f44-39f5-4f13-8ab1-11c370ef85f3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:49:20 crc kubenswrapper[4748]: I0320 11:49:20.559102 4748 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67898f44-39f5-4f13-8ab1-11c370ef85f3-host\") on node \"crc\" DevicePath \"\"" Mar 20 11:49:20 crc kubenswrapper[4748]: I0320 11:49:20.573207 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67898f44-39f5-4f13-8ab1-11c370ef85f3-kube-api-access-t56ld" (OuterVolumeSpecName: "kube-api-access-t56ld") pod "67898f44-39f5-4f13-8ab1-11c370ef85f3" (UID: "67898f44-39f5-4f13-8ab1-11c370ef85f3"). InnerVolumeSpecName "kube-api-access-t56ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:49:20 crc kubenswrapper[4748]: I0320 11:49:20.661863 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t56ld\" (UniqueName: \"kubernetes.io/projected/67898f44-39f5-4f13-8ab1-11c370ef85f3-kube-api-access-t56ld\") on node \"crc\" DevicePath \"\"" Mar 20 11:49:21 crc kubenswrapper[4748]: I0320 11:49:21.314323 4748 scope.go:117] "RemoveContainer" containerID="683a223ffabc15a9baf4c6f95d7a8a26fc820b3e3d96debcc0f6a01f075006dd" Mar 20 11:49:21 crc kubenswrapper[4748]: I0320 11:49:21.314396 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xzd8l/crc-debug-9zxlx" Mar 20 11:49:21 crc kubenswrapper[4748]: I0320 11:49:21.526993 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67898f44-39f5-4f13-8ab1-11c370ef85f3" path="/var/lib/kubelet/pods/67898f44-39f5-4f13-8ab1-11c370ef85f3/volumes" Mar 20 11:49:42 crc kubenswrapper[4748]: I0320 11:49:42.929060 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:49:42 crc kubenswrapper[4748]: I0320 11:49:42.929551 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:49:52 crc kubenswrapper[4748]: I0320 11:49:52.374467 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77b5d7f4f8-8jmkc_201e8a26-7bfa-40c7-aa3d-bf32c1344d61/barbican-api/0.log" Mar 20 11:49:52 crc kubenswrapper[4748]: I0320 11:49:52.529295 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77b5d7f4f8-8jmkc_201e8a26-7bfa-40c7-aa3d-bf32c1344d61/barbican-api-log/0.log" Mar 20 11:49:52 crc kubenswrapper[4748]: I0320 11:49:52.582548 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b69f7d5cb-p5jsk_ed96228e-6626-468c-bf60-a1073dfc123e/barbican-keystone-listener/0.log" Mar 20 11:49:52 crc kubenswrapper[4748]: I0320 11:49:52.686031 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-5b69f7d5cb-p5jsk_ed96228e-6626-468c-bf60-a1073dfc123e/barbican-keystone-listener-log/0.log" Mar 20 11:49:52 crc kubenswrapper[4748]: I0320 11:49:52.807439 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-556bf684bc-f9q9w_6c942b79-bc14-4a48-8fbd-32667bc1afc6/barbican-worker/0.log" Mar 20 11:49:52 crc kubenswrapper[4748]: I0320 11:49:52.812404 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-556bf684bc-f9q9w_6c942b79-bc14-4a48-8fbd-32667bc1afc6/barbican-worker-log/0.log" Mar 20 11:49:53 crc kubenswrapper[4748]: I0320 11:49:53.075103 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_06f5e1bb-615b-4f6e-9c62-93a82f0984c8/ceilometer-central-agent/0.log" Mar 20 11:49:53 crc kubenswrapper[4748]: I0320 11:49:53.320247 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5_01e10255-e1d0-4e62-9b54-4c1043b5f502/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:49:53 crc kubenswrapper[4748]: I0320 11:49:53.615816 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_06f5e1bb-615b-4f6e-9c62-93a82f0984c8/proxy-httpd/0.log" Mar 20 11:49:53 crc kubenswrapper[4748]: I0320 11:49:53.656194 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_06f5e1bb-615b-4f6e-9c62-93a82f0984c8/ceilometer-notification-agent/0.log" Mar 20 11:49:53 crc kubenswrapper[4748]: I0320 11:49:53.673196 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_06f5e1bb-615b-4f6e-9c62-93a82f0984c8/sg-core/0.log" Mar 20 11:49:53 crc kubenswrapper[4748]: I0320 11:49:53.888334 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4376a841-9631-4c91-bae6-9c12b2f46a17/cinder-api/0.log" Mar 20 11:49:53 crc kubenswrapper[4748]: I0320 
11:49:53.897487 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4376a841-9631-4c91-bae6-9c12b2f46a17/cinder-api-log/0.log" Mar 20 11:49:53 crc kubenswrapper[4748]: I0320 11:49:53.970011 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c/cinder-scheduler/0.log" Mar 20 11:49:54 crc kubenswrapper[4748]: I0320 11:49:54.132736 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c/probe/0.log" Mar 20 11:49:54 crc kubenswrapper[4748]: I0320 11:49:54.368955 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2_2e4d68e5-3aee-40fe-98fb-a2c06bdd601e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:49:54 crc kubenswrapper[4748]: I0320 11:49:54.647522 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ckmg2_66fa6d88-f5fa-4288-8b2b-bc30561967c0/init/0.log" Mar 20 11:49:54 crc kubenswrapper[4748]: I0320 11:49:54.815934 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr_9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:49:54 crc kubenswrapper[4748]: I0320 11:49:54.853044 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ckmg2_66fa6d88-f5fa-4288-8b2b-bc30561967c0/init/0.log" Mar 20 11:49:55 crc kubenswrapper[4748]: I0320 11:49:55.054009 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ckmg2_66fa6d88-f5fa-4288-8b2b-bc30561967c0/dnsmasq-dns/0.log" Mar 20 11:49:55 crc kubenswrapper[4748]: I0320 11:49:55.649996 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_ffe44fec-9121-46b2-9087-eba59b656915/glance-log/0.log" Mar 20 11:49:55 crc kubenswrapper[4748]: I0320 11:49:55.666872 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ffe44fec-9121-46b2-9087-eba59b656915/glance-httpd/0.log" Mar 20 11:49:55 crc kubenswrapper[4748]: I0320 11:49:55.900443 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rzshm_a0708128-eacf-422a-8dac-98032a9f12e7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:49:55 crc kubenswrapper[4748]: I0320 11:49:55.915917 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f70c4bdb-308f-485c-9f2b-388e135bdfc9/glance-httpd/0.log" Mar 20 11:49:55 crc kubenswrapper[4748]: I0320 11:49:55.952335 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f70c4bdb-308f-485c-9f2b-388e135bdfc9/glance-log/0.log" Mar 20 11:49:56 crc kubenswrapper[4748]: I0320 11:49:56.209353 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d79b6bb86-nhfts_f3de236a-e527-4582-8eb5-03ca8aa883e0/horizon/0.log" Mar 20 11:49:56 crc kubenswrapper[4748]: I0320 11:49:56.502419 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-d522t_5d2decbf-7d56-4ff7-896e-eaca78da7448/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:49:56 crc kubenswrapper[4748]: I0320 11:49:56.663692 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d79b6bb86-nhfts_f3de236a-e527-4582-8eb5-03ca8aa883e0/horizon-log/0.log" Mar 20 11:49:56 crc kubenswrapper[4748]: I0320 11:49:56.741111 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29566741-sxfvz_7997caf5-1478-40d5-a0c6-6811d242ef17/keystone-cron/0.log" Mar 20 11:49:56 crc kubenswrapper[4748]: I0320 11:49:56.953353 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e9d06c7d-5d90-45f8-b4df-b53bff4761a5/kube-state-metrics/0.log" Mar 20 11:49:57 crc kubenswrapper[4748]: I0320 11:49:57.471445 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-688f5b7cfd-ffmqn_cc43e627-4d33-422e-bfc0-63cb746991ca/keystone-api/0.log" Mar 20 11:49:57 crc kubenswrapper[4748]: I0320 11:49:57.628418 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-n6f7v_93a46290-fef3-4e7a-9cb3-682c3f453cc1/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:49:58 crc kubenswrapper[4748]: I0320 11:49:58.109225 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695f6cc9c-5bkz4_0900bb20-c211-44be-a5f8-6775641e54ca/neutron-httpd/0.log" Mar 20 11:49:58 crc kubenswrapper[4748]: I0320 11:49:58.418443 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8_cba4401d-824d-4c51-8a04-43691fa34a45/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:49:58 crc kubenswrapper[4748]: I0320 11:49:58.447988 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695f6cc9c-5bkz4_0900bb20-c211-44be-a5f8-6775641e54ca/neutron-api/0.log" Mar 20 11:49:59 crc kubenswrapper[4748]: I0320 11:49:59.297058 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8b67252d-9978-4214-bf76-b57b2272c603/nova-cell0-conductor-conductor/0.log" Mar 20 11:49:59 crc kubenswrapper[4748]: I0320 11:49:59.818356 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4_a1734ea2-369d-4c96-aca3-1a450a82e9dc/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:49:59 crc kubenswrapper[4748]: I0320 11:49:59.876931 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a9acc08e-1cf9-4a43-9b60-2bd4e1cad401/nova-cell1-conductor-conductor/0.log" Mar 20 11:49:59 crc kubenswrapper[4748]: I0320 11:49:59.924614 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6acc81f7-4f7d-4828-a328-1e2a4426bd57/nova-api-log/0.log" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.148442 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566790-r97n4"] Mar 20 11:50:00 crc kubenswrapper[4748]: E0320 11:50:00.148990 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67898f44-39f5-4f13-8ab1-11c370ef85f3" containerName="container-00" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.149016 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="67898f44-39f5-4f13-8ab1-11c370ef85f3" containerName="container-00" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.149272 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="67898f44-39f5-4f13-8ab1-11c370ef85f3" containerName="container-00" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.150088 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-r97n4" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.154180 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.155156 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.155442 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.195449 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-r97n4"] Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.236900 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxnwb\" (UniqueName: \"kubernetes.io/projected/5bae15e5-0199-42c4-babb-5da895996b8a-kube-api-access-hxnwb\") pod \"auto-csr-approver-29566790-r97n4\" (UID: \"5bae15e5-0199-42c4-babb-5da895996b8a\") " pod="openshift-infra/auto-csr-approver-29566790-r97n4" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.304101 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a2376389-554f-4c38-bfc1-00962d858ff4/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.338178 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxnwb\" (UniqueName: \"kubernetes.io/projected/5bae15e5-0199-42c4-babb-5da895996b8a-kube-api-access-hxnwb\") pod \"auto-csr-approver-29566790-r97n4\" (UID: \"5bae15e5-0199-42c4-babb-5da895996b8a\") " pod="openshift-infra/auto-csr-approver-29566790-r97n4" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.380311 4748 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-hxnwb\" (UniqueName: \"kubernetes.io/projected/5bae15e5-0199-42c4-babb-5da895996b8a-kube-api-access-hxnwb\") pod \"auto-csr-approver-29566790-r97n4\" (UID: \"5bae15e5-0199-42c4-babb-5da895996b8a\") " pod="openshift-infra/auto-csr-approver-29566790-r97n4" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.453692 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6acc81f7-4f7d-4828-a328-1e2a4426bd57/nova-api-api/0.log" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.474640 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-r97n4" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.742213 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2f35a381-dc79-4781-97a4-1d0c8f96a0d2/nova-metadata-log/0.log" Mar 20 11:50:00 crc kubenswrapper[4748]: I0320 11:50:00.969719 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-r97n4"] Mar 20 11:50:01 crc kubenswrapper[4748]: I0320 11:50:01.253324 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ffd19d53-385f-45a9-a222-caa7fbf6545e/mysql-bootstrap/0.log" Mar 20 11:50:01 crc kubenswrapper[4748]: I0320 11:50:01.345747 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2f35a381-dc79-4781-97a4-1d0c8f96a0d2/nova-metadata-metadata/0.log" Mar 20 11:50:01 crc kubenswrapper[4748]: I0320 11:50:01.475626 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_734b70b0-5549-4b8e-aa70-a9d589c5b457/nova-scheduler-scheduler/0.log" Mar 20 11:50:01 crc kubenswrapper[4748]: I0320 11:50:01.513523 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ffd19d53-385f-45a9-a222-caa7fbf6545e/mysql-bootstrap/0.log" Mar 20 11:50:01 crc kubenswrapper[4748]: 
I0320 11:50:01.534436 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ffd19d53-385f-45a9-a222-caa7fbf6545e/galera/0.log" Mar 20 11:50:01 crc kubenswrapper[4748]: I0320 11:50:01.763581 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_74743039-97e4-46cf-8fbf-183c8c11ca20/mysql-bootstrap/0.log" Mar 20 11:50:01 crc kubenswrapper[4748]: I0320 11:50:01.916742 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-r97n4" event={"ID":"5bae15e5-0199-42c4-babb-5da895996b8a","Type":"ContainerStarted","Data":"1f6196438726d1e2e4d3bac65ea88b8a6d512b1ae26b53548113d18609ee0408"} Mar 20 11:50:01 crc kubenswrapper[4748]: I0320 11:50:01.919888 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_74743039-97e4-46cf-8fbf-183c8c11ca20/mysql-bootstrap/0.log" Mar 20 11:50:01 crc kubenswrapper[4748]: I0320 11:50:01.991964 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_74743039-97e4-46cf-8fbf-183c8c11ca20/galera/0.log" Mar 20 11:50:02 crc kubenswrapper[4748]: I0320 11:50:02.028218 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-fjmvn_4457d05c-f317-4cf2-97cb-03888616f4af/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:50:02 crc kubenswrapper[4748]: I0320 11:50:02.180122 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2b2f2b26-6292-47bb-b8ee-971d9b47c85d/openstackclient/0.log" Mar 20 11:50:02 crc kubenswrapper[4748]: I0320 11:50:02.197174 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-bldp9_2482f122-92d5-410c-b4c0-41834cea1711/ovn-controller/0.log" Mar 20 11:50:02 crc kubenswrapper[4748]: I0320 11:50:02.463319 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-zhsd6_6b172e4c-c5b0-4573-b80c-9bc074489627/openstack-network-exporter/0.log"
Mar 20 11:50:02 crc kubenswrapper[4748]: I0320 11:50:02.553561 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-756zb_334c1861-88b1-44e2-a02e-ad1dcecf2fc0/ovsdb-server-init/0.log"
Mar 20 11:50:02 crc kubenswrapper[4748]: I0320 11:50:02.771388 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-756zb_334c1861-88b1-44e2-a02e-ad1dcecf2fc0/ovsdb-server-init/0.log"
Mar 20 11:50:02 crc kubenswrapper[4748]: I0320 11:50:02.779540 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-756zb_334c1861-88b1-44e2-a02e-ad1dcecf2fc0/ovsdb-server/0.log"
Mar 20 11:50:02 crc kubenswrapper[4748]: I0320 11:50:02.814782 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-756zb_334c1861-88b1-44e2-a02e-ad1dcecf2fc0/ovs-vswitchd/0.log"
Mar 20 11:50:02 crc kubenswrapper[4748]: I0320 11:50:02.927100 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-r97n4" event={"ID":"5bae15e5-0199-42c4-babb-5da895996b8a","Type":"ContainerStarted","Data":"dc459f89e289407986eebe2f2d008cec672587e5f42521b934c399b13bdc5aa0"}
Mar 20 11:50:02 crc kubenswrapper[4748]: I0320 11:50:02.951224 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566790-r97n4" podStartSLOduration=1.572918917 podStartE2EDuration="2.951201052s" podCreationTimestamp="2026-03-20 11:50:00 +0000 UTC" firstStartedPulling="2026-03-20 11:50:00.956866248 +0000 UTC m=+4436.098412052" lastFinishedPulling="2026-03-20 11:50:02.335148373 +0000 UTC m=+4437.476694187" observedRunningTime="2026-03-20 11:50:02.942700247 +0000 UTC m=+4438.084246061" watchObservedRunningTime="2026-03-20 11:50:02.951201052 +0000 UTC m=+4438.092746866"
Mar 20 11:50:03 crc kubenswrapper[4748]: I0320 11:50:03.055952 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9b80cd60-a2f6-4638-a600-4d866573bbc3/openstack-network-exporter/0.log"
Mar 20 11:50:03 crc kubenswrapper[4748]: I0320 11:50:03.287888 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-6qhn9_17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 11:50:03 crc kubenswrapper[4748]: I0320 11:50:03.759440 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9b80cd60-a2f6-4638-a600-4d866573bbc3/ovn-northd/0.log"
Mar 20 11:50:03 crc kubenswrapper[4748]: I0320 11:50:03.761892 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2/openstack-network-exporter/0.log"
Mar 20 11:50:03 crc kubenswrapper[4748]: I0320 11:50:03.803368 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2/ovsdbserver-nb/0.log"
Mar 20 11:50:03 crc kubenswrapper[4748]: I0320 11:50:03.945286 4748 generic.go:334] "Generic (PLEG): container finished" podID="5bae15e5-0199-42c4-babb-5da895996b8a" containerID="dc459f89e289407986eebe2f2d008cec672587e5f42521b934c399b13bdc5aa0" exitCode=0
Mar 20 11:50:03 crc kubenswrapper[4748]: I0320 11:50:03.945546 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-r97n4" event={"ID":"5bae15e5-0199-42c4-babb-5da895996b8a","Type":"ContainerDied","Data":"dc459f89e289407986eebe2f2d008cec672587e5f42521b934c399b13bdc5aa0"}
Mar 20 11:50:03 crc kubenswrapper[4748]: I0320 11:50:03.981168 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78/openstack-network-exporter/0.log"
Mar 20 11:50:04 crc kubenswrapper[4748]: I0320 11:50:04.102603 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78/ovsdbserver-sb/0.log"
Mar 20 11:50:04 crc kubenswrapper[4748]: I0320 11:50:04.339206 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_507647d5-8633-4346-a9e0-4af3eb0e3e5f/setup-container/0.log"
Mar 20 11:50:04 crc kubenswrapper[4748]: I0320 11:50:04.384308 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56cc45587d-dchtq_9bd6666d-34bf-42aa-bac6-e119898e279d/placement-api/0.log"
Mar 20 11:50:04 crc kubenswrapper[4748]: I0320 11:50:04.524244 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_507647d5-8633-4346-a9e0-4af3eb0e3e5f/setup-container/0.log"
Mar 20 11:50:04 crc kubenswrapper[4748]: I0320 11:50:04.578292 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56cc45587d-dchtq_9bd6666d-34bf-42aa-bac6-e119898e279d/placement-log/0.log"
Mar 20 11:50:04 crc kubenswrapper[4748]: I0320 11:50:04.660458 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_507647d5-8633-4346-a9e0-4af3eb0e3e5f/rabbitmq/0.log"
Mar 20 11:50:04 crc kubenswrapper[4748]: I0320 11:50:04.736091 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a/setup-container/0.log"
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.363043 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a/rabbitmq/0.log"
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.364542 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a/setup-container/0.log"
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.392545 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-r97n4"
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.468389 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6_e51cf464-1d93-4c6c-99f9-418be04dce30/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.551646 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxnwb\" (UniqueName: \"kubernetes.io/projected/5bae15e5-0199-42c4-babb-5da895996b8a-kube-api-access-hxnwb\") pod \"5bae15e5-0199-42c4-babb-5da895996b8a\" (UID: \"5bae15e5-0199-42c4-babb-5da895996b8a\") "
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.557980 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bae15e5-0199-42c4-babb-5da895996b8a-kube-api-access-hxnwb" (OuterVolumeSpecName: "kube-api-access-hxnwb") pod "5bae15e5-0199-42c4-babb-5da895996b8a" (UID: "5bae15e5-0199-42c4-babb-5da895996b8a"). InnerVolumeSpecName "kube-api-access-hxnwb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.653926 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxnwb\" (UniqueName: \"kubernetes.io/projected/5bae15e5-0199-42c4-babb-5da895996b8a-kube-api-access-hxnwb\") on node \"crc\" DevicePath \"\""
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.657237 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-s6rzn_c8abd498-c75d-47c5-992b-77857b856c30/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.760813 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6_6298ed1e-1de4-489a-ba4c-ca6f3f989909/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.951392 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gl7rt_4ae809a6-7d0a-4b85-a623-eda42d60e2d7/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.963930 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-r97n4" event={"ID":"5bae15e5-0199-42c4-babb-5da895996b8a","Type":"ContainerDied","Data":"1f6196438726d1e2e4d3bac65ea88b8a6d512b1ae26b53548113d18609ee0408"}
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.963978 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f6196438726d1e2e4d3bac65ea88b8a6d512b1ae26b53548113d18609ee0408"
Mar 20 11:50:05 crc kubenswrapper[4748]: I0320 11:50:05.964201 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-r97n4"
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.014119 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-9bjsw"]
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.022410 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-9bjsw"]
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.068751 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-66zrt_80a3504f-a9f2-4be3-9f87-e4110fb5fc7b/ssh-known-hosts-edpm-deployment/0.log"
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.323913 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c66c949c-9cv26_12263f55-f4a7-481f-afab-45f51bd4d60d/proxy-httpd/0.log"
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.327197 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c66c949c-9cv26_12263f55-f4a7-481f-afab-45f51bd4d60d/proxy-server/0.log"
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.558084 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mlztp_8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17/swift-ring-rebalance/0.log"
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.613586 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/account-reaper/0.log"
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.622849 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/account-auditor/0.log"
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.774102 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/account-replicator/0.log"
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.820343 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/container-auditor/0.log"
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.921528 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/account-server/0.log"
Mar 20 11:50:06 crc kubenswrapper[4748]: I0320 11:50:06.949975 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/container-replicator/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.028679 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/container-server/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.120244 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/container-updater/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.196173 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/object-expirer/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.210745 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/object-auditor/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.234873 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_27339553-c013-4538-9a4d-5bbd249c197c/memcached/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.261118 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/object-replicator/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.307697 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/object-server/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.405379 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/rsync/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.409813 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/object-updater/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.469829 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/swift-recon-cron/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.526411 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e98a37db-3a1d-4b1b-9442-7f4d005d3c1f" path="/var/lib/kubelet/pods/e98a37db-3a1d-4b1b-9442-7f4d005d3c1f/volumes"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.884113 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_da9b4776-4f59-46e4-9cdf-953b0a7f83bf/tempest-tests-tempest-tests-runner/0.log"
Mar 20 11:50:07 crc kubenswrapper[4748]: I0320 11:50:07.891467 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_46161687-c406-4f73-aced-74edbd5e2f81/test-operator-logs-container/0.log"
Mar 20 11:50:08 crc kubenswrapper[4748]: I0320 11:50:08.156732 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf_2da6177a-9350-445c-820e-cf678dfd5500/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 11:50:08 crc kubenswrapper[4748]: I0320 11:50:08.312310 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b_d03d3cf8-b0f5-46bf-9396-e6da7698e6fb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 11:50:12 crc kubenswrapper[4748]: I0320 11:50:12.928106 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:50:12 crc kubenswrapper[4748]: I0320 11:50:12.928700 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:50:12 crc kubenswrapper[4748]: I0320 11:50:12.928755 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz"
Mar 20 11:50:12 crc kubenswrapper[4748]: I0320 11:50:12.929689 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ca3fadedbbf1ffc460403a3135324978873632bc777457c2622cd86b79e06ad"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 11:50:12 crc kubenswrapper[4748]: I0320 11:50:12.929760 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://2ca3fadedbbf1ffc460403a3135324978873632bc777457c2622cd86b79e06ad" gracePeriod=600
Mar 20 11:50:14 crc kubenswrapper[4748]: I0320 11:50:14.037895 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="2ca3fadedbbf1ffc460403a3135324978873632bc777457c2622cd86b79e06ad" exitCode=0
Mar 20 11:50:14 crc kubenswrapper[4748]: I0320 11:50:14.037975 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"2ca3fadedbbf1ffc460403a3135324978873632bc777457c2622cd86b79e06ad"}
Mar 20 11:50:14 crc kubenswrapper[4748]: I0320 11:50:14.038484 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d"}
Mar 20 11:50:14 crc kubenswrapper[4748]: I0320 11:50:14.038507 4748 scope.go:117] "RemoveContainer" containerID="73b367691f863dac06f8e78eb6207c111ac5931452db06413eefe37857198933"
Mar 20 11:50:24 crc kubenswrapper[4748]: I0320 11:50:24.484185 4748 scope.go:117] "RemoveContainer" containerID="6ff3a1bc20f0af391a6f5a181e62de21f5b826195df1c2c99f784c889a972fac"
Mar 20 11:50:32 crc kubenswrapper[4748]: I0320 11:50:32.836111 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/util/0.log"
Mar 20 11:50:32 crc kubenswrapper[4748]: I0320 11:50:32.981335 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/util/0.log"
Mar 20 11:50:32 crc kubenswrapper[4748]: I0320 11:50:32.996282 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/pull/0.log"
Mar 20 11:50:33 crc kubenswrapper[4748]: I0320 11:50:33.025310 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/pull/0.log"
Mar 20 11:50:33 crc kubenswrapper[4748]: I0320 11:50:33.215490 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/extract/0.log"
Mar 20 11:50:33 crc kubenswrapper[4748]: I0320 11:50:33.233220 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/pull/0.log"
Mar 20 11:50:33 crc kubenswrapper[4748]: I0320 11:50:33.235485 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/util/0.log"
Mar 20 11:50:33 crc kubenswrapper[4748]: I0320 11:50:33.448498 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-4gsq2_20023868-c089-41ec-ac26-9b4882fbab50/manager/0.log"
Mar 20 11:50:33 crc kubenswrapper[4748]: I0320 11:50:33.690438 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-9gzr8_d855d6bf-853d-454b-b0b7-feb11f23cc17/manager/0.log"
Mar 20 11:50:33 crc kubenswrapper[4748]: I0320 11:50:33.823896 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-dn24h_8582a4fb-51b2-411c-a67f-31a023f40493/manager/0.log"
Mar 20 11:50:33 crc kubenswrapper[4748]: I0320 11:50:33.906222 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-r4sct_3fb5bb3a-ab86-4c3f-9d2d-9ef6d7f7ca1f/manager/0.log"
Mar 20 11:50:34 crc kubenswrapper[4748]: I0320 11:50:34.142450 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-9gkdc_5d9f2386-33fc-43e9-9a61-e0d57fd94fbe/manager/0.log"
Mar 20 11:50:34 crc kubenswrapper[4748]: I0320 11:50:34.384526 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-9zhhz_640d4c26-acbd-4cb4-8b59-fde206294a91/manager/0.log"
Mar 20 11:50:34 crc kubenswrapper[4748]: I0320 11:50:34.603373 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-x7sjb_c29e0600-cf39-40bf-9225-48e55c4b8f97/manager/0.log"
Mar 20 11:50:34 crc kubenswrapper[4748]: I0320 11:50:34.719146 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-prgc2_17f5527d-b31e-4788-ab09-ac5d26ea1bce/manager/0.log"
Mar 20 11:50:34 crc kubenswrapper[4748]: I0320 11:50:34.946162 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-blfgz_6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d/manager/0.log"
Mar 20 11:50:35 crc kubenswrapper[4748]: I0320 11:50:35.168966 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-r5z4d_b0d0b327-5826-4c41-84bc-8b2c2bb05756/manager/0.log"
Mar 20 11:50:35 crc kubenswrapper[4748]: I0320 11:50:35.169032 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-stsk5_1ec4d02c-2709-4102-8a27-c4e7c71ed61f/manager/0.log"
Mar 20 11:50:35 crc kubenswrapper[4748]: I0320 11:50:35.273938 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-zcqnn_ecd87b49-65fa-465e-a668-03cb90381b6e/manager/0.log"
Mar 20 11:50:35 crc kubenswrapper[4748]: I0320 11:50:35.452656 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-cgzmd_49092c30-9830-451a-8003-2cc7fa078b62/manager/0.log"
Mar 20 11:50:35 crc kubenswrapper[4748]: I0320 11:50:35.483850 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-m84q7_bd4cdccf-68e3-4c27-ae51-f54b8089e08b/manager/0.log"
Mar 20 11:50:35 crc kubenswrapper[4748]: I0320 11:50:35.601991 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-rxplv_f0a3f8d9-dcfa-498a-a46e-61628aa68067/manager/0.log"
Mar 20 11:50:35 crc kubenswrapper[4748]: I0320 11:50:35.809286 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-d8d579484-vx6pz_35554cb6-28ee-4104-8591-ee987f93805b/operator/0.log"
Mar 20 11:50:36 crc kubenswrapper[4748]: I0320 11:50:36.105307 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-p9wd8_246b06bc-5f0b-4ef1-87eb-a0f56ad26e30/registry-server/0.log"
Mar 20 11:50:36 crc kubenswrapper[4748]: I0320 11:50:36.307826 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-rt4hn_bf9a7295-e355-4c61-a841-fd2bce675235/manager/0.log"
Mar 20 11:50:36 crc kubenswrapper[4748]: I0320 11:50:36.358608 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-897gg_736beaed-774c-43c0-bff9-d66a5ae4a1f5/manager/0.log"
Mar 20 11:50:36 crc kubenswrapper[4748]: I0320 11:50:36.888389 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5647f98656-9pqdv_795ce1d0-2232-4ed7-8618-c47a7584973e/manager/0.log"
Mar 20 11:50:37 crc kubenswrapper[4748]: I0320 11:50:37.008720 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7tgsh_b0646c53-71d5-40d9-8a3b-77c244fff7c4/operator/0.log"
Mar 20 11:50:37 crc kubenswrapper[4748]: I0320 11:50:37.110643 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-2lkhq_1988c5e6-a91c-4085-a878-2ffdf478fa1b/manager/0.log"
Mar 20 11:50:37 crc kubenswrapper[4748]: I0320 11:50:37.209648 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-9pmr7_44b5dd5f-9a81-4c01-8efd-6d4997bb9c94/manager/0.log"
Mar 20 11:50:37 crc kubenswrapper[4748]: I0320 11:50:37.314622 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-vsd7j_8e47e918-33de-4a66-9223-7ee3264600c1/manager/0.log"
Mar 20 11:50:37 crc kubenswrapper[4748]: I0320 11:50:37.366359 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-h7vwp_b6d05c98-f000-4560-b790-da31157488dc/manager/0.log"
Mar 20 11:50:56 crc kubenswrapper[4748]: I0320 11:50:56.980911 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-v79zw_a6110c56-5634-4ef9-92b1-4c7c75dd4986/control-plane-machine-set-operator/0.log"
Mar 20 11:50:57 crc kubenswrapper[4748]: I0320 11:50:57.527622 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9lbsk_2cc53e20-383b-4e3a-a00a-d54ac8272e00/kube-rbac-proxy/0.log"
Mar 20 11:50:57 crc kubenswrapper[4748]: I0320 11:50:57.575660 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9lbsk_2cc53e20-383b-4e3a-a00a-d54ac8272e00/machine-api-operator/0.log"
Mar 20 11:51:10 crc kubenswrapper[4748]: I0320 11:51:10.437761 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-krcxd_668082d7-988d-415d-bfde-1c28171130b5/cert-manager-controller/0.log"
Mar 20 11:51:10 crc kubenswrapper[4748]: I0320 11:51:10.563511 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-wn2nc_bd82c1fc-6aff-4336-80fd-247fbc7aed58/cert-manager-cainjector/0.log"
Mar 20 11:51:10 crc kubenswrapper[4748]: I0320 11:51:10.618489 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-jt8rf_2f029dc3-bb1a-4b18-91ee-fd467cbe157f/cert-manager-webhook/0.log"
Mar 20 11:51:22 crc kubenswrapper[4748]: I0320 11:51:22.389252 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-l8vtz_2781b78b-43e7-4826-8e44-74f302a93478/nmstate-console-plugin/0.log"
Mar 20 11:51:22 crc kubenswrapper[4748]: I0320 11:51:22.527939 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sms75_12dadf04-5eff-4e48-96cd-d8033b0baf63/nmstate-handler/0.log"
Mar 20 11:51:22 crc kubenswrapper[4748]: I0320 11:51:22.599538 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-n2kjn_48cc57fd-25e5-490f-af0b-13a1e5f9be6d/kube-rbac-proxy/0.log"
Mar 20 11:51:22 crc kubenswrapper[4748]: I0320 11:51:22.705093 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-n2kjn_48cc57fd-25e5-490f-af0b-13a1e5f9be6d/nmstate-metrics/0.log"
Mar 20 11:51:22 crc kubenswrapper[4748]: I0320 11:51:22.764289 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-hfdql_6ec3f263-5ad8-494d-88d0-ec9f60f7c6bc/nmstate-operator/0.log"
Mar 20 11:51:22 crc kubenswrapper[4748]: I0320 11:51:22.909494 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-w284q_fd6c2ee9-cef6-406b-a6e0-e1f741be9f61/nmstate-webhook/0.log"
Mar 20 11:51:51 crc kubenswrapper[4748]: I0320 11:51:51.319100 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-ppl7d_1dcaf132-7ad3-4b86-ba2f-e695238b2001/kube-rbac-proxy/0.log"
Mar 20 11:51:51 crc kubenswrapper[4748]: I0320 11:51:51.453741 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-ppl7d_1dcaf132-7ad3-4b86-ba2f-e695238b2001/controller/0.log"
Mar 20 11:51:51 crc kubenswrapper[4748]: I0320 11:51:51.528141 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-8rblj_de736bb5-e7a6-4a9a-8841-5ff65871db92/frr-k8s-webhook-server/0.log"
Mar 20 11:51:51 crc kubenswrapper[4748]: I0320 11:51:51.651554 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-frr-files/0.log"
Mar 20 11:51:51 crc kubenswrapper[4748]: I0320 11:51:51.822459 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-frr-files/0.log"
Mar 20 11:51:51 crc kubenswrapper[4748]: I0320 11:51:51.831249 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-reloader/0.log"
Mar 20 11:51:51 crc kubenswrapper[4748]: I0320 11:51:51.850937 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-metrics/0.log"
Mar 20 11:51:51 crc kubenswrapper[4748]: I0320 11:51:51.860254 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-reloader/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.059463 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-metrics/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.070359 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-reloader/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.083976 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-frr-files/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.140392 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-metrics/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.249538 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-metrics/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.258177 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-frr-files/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.286646 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-reloader/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.333858 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/controller/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.438621 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/frr-metrics/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.467018 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/kube-rbac-proxy/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.547684 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/kube-rbac-proxy-frr/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.660483 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/reloader/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.828519 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b86d95c7b-kqtkn_7b6c6eee-f00c-458f-b050-6aaab992addf/manager/0.log"
Mar 20 11:51:52 crc kubenswrapper[4748]: I0320 11:51:52.973417 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c7659c7ff-w4c8m_be8702a2-29a5-4037-94f4-0f3a4b48754d/webhook-server/0.log"
Mar 20 11:51:53 crc kubenswrapper[4748]: I0320 11:51:53.133550 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vkqf7_7a239ede-0107-4751-b103-27b225f2cf5e/kube-rbac-proxy/0.log"
Mar 20 11:51:53 crc kubenswrapper[4748]: I0320 11:51:53.665102 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vkqf7_7a239ede-0107-4751-b103-27b225f2cf5e/speaker/0.log"
Mar 20 11:51:54 crc kubenswrapper[4748]: I0320 11:51:54.167310 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/frr/0.log"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.141957 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566792-7t722"]
Mar 20 11:52:00 crc kubenswrapper[4748]: E0320 11:52:00.142955 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bae15e5-0199-42c4-babb-5da895996b8a" containerName="oc"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.142972 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bae15e5-0199-42c4-babb-5da895996b8a" containerName="oc"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.143206 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bae15e5-0199-42c4-babb-5da895996b8a" containerName="oc"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.144047 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-7t722"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.146022 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.146145 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.146849 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.153817 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566792-7t722"]
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.261141 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2tkd\" (UniqueName: \"kubernetes.io/projected/2071e2d0-4ad8-4cf2-b478-502bf98abb04-kube-api-access-n2tkd\") pod \"auto-csr-approver-29566792-7t722\" (UID: \"2071e2d0-4ad8-4cf2-b478-502bf98abb04\") " pod="openshift-infra/auto-csr-approver-29566792-7t722"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.363591 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2tkd\" (UniqueName: \"kubernetes.io/projected/2071e2d0-4ad8-4cf2-b478-502bf98abb04-kube-api-access-n2tkd\") pod \"auto-csr-approver-29566792-7t722\" (UID: \"2071e2d0-4ad8-4cf2-b478-502bf98abb04\") " pod="openshift-infra/auto-csr-approver-29566792-7t722"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.771658 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2tkd\" (UniqueName: \"kubernetes.io/projected/2071e2d0-4ad8-4cf2-b478-502bf98abb04-kube-api-access-n2tkd\") pod \"auto-csr-approver-29566792-7t722\" (UID: \"2071e2d0-4ad8-4cf2-b478-502bf98abb04\") " pod="openshift-infra/auto-csr-approver-29566792-7t722"
Mar 20 11:52:00 crc kubenswrapper[4748]: I0320 11:52:00.774241 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-7t722"
Mar 20 11:52:01 crc kubenswrapper[4748]: I0320 11:52:01.305229 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566792-7t722"]
Mar 20 11:52:01 crc kubenswrapper[4748]: I0320 11:52:01.315463 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 11:52:01 crc kubenswrapper[4748]: I0320 11:52:01.970313 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-7t722" event={"ID":"2071e2d0-4ad8-4cf2-b478-502bf98abb04","Type":"ContainerStarted","Data":"92f769d3d2c92aeddb6e3a3999c00ed03d2acd7fc6ef5b160c9f29839dd38f3c"}
Mar 20 11:52:02 crc kubenswrapper[4748]: I0320 11:52:02.980660 4748 generic.go:334] "Generic (PLEG): container finished" podID="2071e2d0-4ad8-4cf2-b478-502bf98abb04" containerID="f4982bba606bf16f715397ff55f7527addd6a3b11a360d72d75a6724a692388b" exitCode=0
Mar 20 11:52:02 crc kubenswrapper[4748]: I0320 11:52:02.980716 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-7t722" event={"ID":"2071e2d0-4ad8-4cf2-b478-502bf98abb04","Type":"ContainerDied","Data":"f4982bba606bf16f715397ff55f7527addd6a3b11a360d72d75a6724a692388b"}
Mar 20 11:52:04 crc kubenswrapper[4748]: I0320 11:52:04.337144 4748 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-7t722" Mar 20 11:52:04 crc kubenswrapper[4748]: I0320 11:52:04.436657 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2tkd\" (UniqueName: \"kubernetes.io/projected/2071e2d0-4ad8-4cf2-b478-502bf98abb04-kube-api-access-n2tkd\") pod \"2071e2d0-4ad8-4cf2-b478-502bf98abb04\" (UID: \"2071e2d0-4ad8-4cf2-b478-502bf98abb04\") " Mar 20 11:52:04 crc kubenswrapper[4748]: I0320 11:52:04.442819 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2071e2d0-4ad8-4cf2-b478-502bf98abb04-kube-api-access-n2tkd" (OuterVolumeSpecName: "kube-api-access-n2tkd") pod "2071e2d0-4ad8-4cf2-b478-502bf98abb04" (UID: "2071e2d0-4ad8-4cf2-b478-502bf98abb04"). InnerVolumeSpecName "kube-api-access-n2tkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:52:04 crc kubenswrapper[4748]: I0320 11:52:04.539326 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2tkd\" (UniqueName: \"kubernetes.io/projected/2071e2d0-4ad8-4cf2-b478-502bf98abb04-kube-api-access-n2tkd\") on node \"crc\" DevicePath \"\"" Mar 20 11:52:04 crc kubenswrapper[4748]: I0320 11:52:04.998109 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-7t722" event={"ID":"2071e2d0-4ad8-4cf2-b478-502bf98abb04","Type":"ContainerDied","Data":"92f769d3d2c92aeddb6e3a3999c00ed03d2acd7fc6ef5b160c9f29839dd38f3c"} Mar 20 11:52:04 crc kubenswrapper[4748]: I0320 11:52:04.998155 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92f769d3d2c92aeddb6e3a3999c00ed03d2acd7fc6ef5b160c9f29839dd38f3c" Mar 20 11:52:04 crc kubenswrapper[4748]: I0320 11:52:04.998165 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-7t722" Mar 20 11:52:05 crc kubenswrapper[4748]: I0320 11:52:05.411641 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-hdlq9"] Mar 20 11:52:05 crc kubenswrapper[4748]: I0320 11:52:05.424219 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-hdlq9"] Mar 20 11:52:05 crc kubenswrapper[4748]: I0320 11:52:05.526623 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf33acae-b0f1-469d-a655-28fa87914356" path="/var/lib/kubelet/pods/cf33acae-b0f1-469d-a655-28fa87914356/volumes" Mar 20 11:52:08 crc kubenswrapper[4748]: I0320 11:52:08.483316 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/util/0.log" Mar 20 11:52:09 crc kubenswrapper[4748]: I0320 11:52:09.261849 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/pull/0.log" Mar 20 11:52:09 crc kubenswrapper[4748]: I0320 11:52:09.300567 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/util/0.log" Mar 20 11:52:09 crc kubenswrapper[4748]: I0320 11:52:09.329803 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/pull/0.log" Mar 20 11:52:09 crc kubenswrapper[4748]: I0320 11:52:09.528818 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/util/0.log" Mar 20 
11:52:09 crc kubenswrapper[4748]: I0320 11:52:09.562006 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/pull/0.log" Mar 20 11:52:09 crc kubenswrapper[4748]: I0320 11:52:09.580522 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/extract/0.log" Mar 20 11:52:09 crc kubenswrapper[4748]: I0320 11:52:09.699125 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/util/0.log" Mar 20 11:52:09 crc kubenswrapper[4748]: I0320 11:52:09.876720 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/util/0.log" Mar 20 11:52:09 crc kubenswrapper[4748]: I0320 11:52:09.942074 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/pull/0.log" Mar 20 11:52:09 crc kubenswrapper[4748]: I0320 11:52:09.978109 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/pull/0.log" Mar 20 11:52:10 crc kubenswrapper[4748]: I0320 11:52:10.127228 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/util/0.log" Mar 20 11:52:10 crc kubenswrapper[4748]: I0320 11:52:10.190713 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/pull/0.log" Mar 20 11:52:10 crc kubenswrapper[4748]: I0320 11:52:10.208995 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/extract/0.log" Mar 20 11:52:10 crc kubenswrapper[4748]: I0320 11:52:10.332665 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-utilities/0.log" Mar 20 11:52:10 crc kubenswrapper[4748]: I0320 11:52:10.984301 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-content/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.001297 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-utilities/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.023917 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-content/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.154469 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-utilities/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.191761 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-content/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.411459 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-utilities/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.691102 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-utilities/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.711809 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/registry-server/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.737033 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-content/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.774210 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-content/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.901720 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-utilities/0.log" Mar 20 11:52:11 crc kubenswrapper[4748]: I0320 11:52:11.910209 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-content/0.log" Mar 20 11:52:12 crc kubenswrapper[4748]: I0320 11:52:12.193092 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/registry-server/0.log" Mar 20 11:52:12 crc kubenswrapper[4748]: I0320 11:52:12.383754 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2qp4n_4e32ab73-56fa-4a44-bb26-42d87e8ee2d5/marketplace-operator/0.log" Mar 20 11:52:12 crc kubenswrapper[4748]: I0320 11:52:12.479504 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-utilities/0.log" Mar 20 11:52:12 crc kubenswrapper[4748]: I0320 11:52:12.630951 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-utilities/0.log" Mar 20 11:52:12 crc kubenswrapper[4748]: I0320 11:52:12.688321 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-content/0.log" Mar 20 11:52:12 crc kubenswrapper[4748]: I0320 11:52:12.714493 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-content/0.log" Mar 20 11:52:12 crc kubenswrapper[4748]: I0320 11:52:12.860447 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-content/0.log" Mar 20 11:52:12 crc kubenswrapper[4748]: I0320 11:52:12.926104 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-utilities/0.log" Mar 20 11:52:13 crc kubenswrapper[4748]: I0320 11:52:13.102277 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-utilities/0.log" Mar 20 11:52:13 crc kubenswrapper[4748]: I0320 11:52:13.126376 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/registry-server/0.log" Mar 20 11:52:13 crc kubenswrapper[4748]: I0320 11:52:13.269746 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-utilities/0.log" Mar 20 11:52:13 crc kubenswrapper[4748]: I0320 11:52:13.290824 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-content/0.log" Mar 20 11:52:13 crc kubenswrapper[4748]: I0320 11:52:13.292930 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-content/0.log" Mar 20 11:52:13 crc kubenswrapper[4748]: I0320 11:52:13.484955 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-content/0.log" Mar 20 11:52:13 crc kubenswrapper[4748]: I0320 11:52:13.486534 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-utilities/0.log" Mar 20 11:52:14 crc kubenswrapper[4748]: I0320 11:52:14.102776 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/registry-server/0.log" Mar 20 11:52:24 crc kubenswrapper[4748]: I0320 11:52:24.592487 4748 scope.go:117] "RemoveContainer" containerID="3927b8c0333840b426f73fb675e152ef07b0f09bf3e10d51f94e44853e11f317" Mar 20 11:52:42 crc kubenswrapper[4748]: I0320 11:52:42.928998 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:52:42 crc kubenswrapper[4748]: I0320 11:52:42.929487 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.644675 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jxpg8"] Mar 20 11:52:52 crc kubenswrapper[4748]: E0320 11:52:52.645639 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2071e2d0-4ad8-4cf2-b478-502bf98abb04" containerName="oc" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.645654 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2071e2d0-4ad8-4cf2-b478-502bf98abb04" containerName="oc" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.645870 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="2071e2d0-4ad8-4cf2-b478-502bf98abb04" containerName="oc" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.647531 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.653698 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jxpg8"] Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.660879 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-utilities\") pod \"certified-operators-jxpg8\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.661027 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs655\" (UniqueName: \"kubernetes.io/projected/61c33938-11f4-402d-96ef-8351465eaf90-kube-api-access-rs655\") pod \"certified-operators-jxpg8\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.661092 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-catalog-content\") pod \"certified-operators-jxpg8\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.763466 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-catalog-content\") pod \"certified-operators-jxpg8\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.763623 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-utilities\") pod \"certified-operators-jxpg8\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.764002 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs655\" (UniqueName: \"kubernetes.io/projected/61c33938-11f4-402d-96ef-8351465eaf90-kube-api-access-rs655\") pod \"certified-operators-jxpg8\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.764354 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-catalog-content\") pod \"certified-operators-jxpg8\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.764491 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-utilities\") pod \"certified-operators-jxpg8\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.790324 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs655\" (UniqueName: \"kubernetes.io/projected/61c33938-11f4-402d-96ef-8351465eaf90-kube-api-access-rs655\") pod \"certified-operators-jxpg8\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:52 crc kubenswrapper[4748]: I0320 11:52:52.968113 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:52:53 crc kubenswrapper[4748]: I0320 11:52:53.419562 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jxpg8"] Mar 20 11:52:53 crc kubenswrapper[4748]: I0320 11:52:53.438526 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxpg8" event={"ID":"61c33938-11f4-402d-96ef-8351465eaf90","Type":"ContainerStarted","Data":"a694e5537daa00b35f1a72173ef65a9b22889301bd27fcaac70c8e4c72a0828d"} Mar 20 11:52:54 crc kubenswrapper[4748]: I0320 11:52:54.449763 4748 generic.go:334] "Generic (PLEG): container finished" podID="61c33938-11f4-402d-96ef-8351465eaf90" containerID="91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97" exitCode=0 Mar 20 11:52:54 crc kubenswrapper[4748]: I0320 11:52:54.450183 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxpg8" event={"ID":"61c33938-11f4-402d-96ef-8351465eaf90","Type":"ContainerDied","Data":"91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97"} Mar 20 11:52:56 crc kubenswrapper[4748]: I0320 11:52:56.472628 4748 generic.go:334] "Generic (PLEG): container finished" podID="61c33938-11f4-402d-96ef-8351465eaf90" containerID="7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa" exitCode=0 Mar 20 11:52:56 crc kubenswrapper[4748]: I0320 11:52:56.472896 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxpg8" event={"ID":"61c33938-11f4-402d-96ef-8351465eaf90","Type":"ContainerDied","Data":"7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa"} Mar 20 11:52:57 crc kubenswrapper[4748]: I0320 11:52:57.488934 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxpg8" 
event={"ID":"61c33938-11f4-402d-96ef-8351465eaf90","Type":"ContainerStarted","Data":"4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052"} Mar 20 11:52:57 crc kubenswrapper[4748]: I0320 11:52:57.561208 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jxpg8" podStartSLOduration=3.123748841 podStartE2EDuration="5.5611773s" podCreationTimestamp="2026-03-20 11:52:52 +0000 UTC" firstStartedPulling="2026-03-20 11:52:54.451585693 +0000 UTC m=+4609.593131507" lastFinishedPulling="2026-03-20 11:52:56.889014152 +0000 UTC m=+4612.030559966" observedRunningTime="2026-03-20 11:52:57.513476244 +0000 UTC m=+4612.655022068" watchObservedRunningTime="2026-03-20 11:52:57.5611773 +0000 UTC m=+4612.702723114" Mar 20 11:53:02 crc kubenswrapper[4748]: I0320 11:53:02.968968 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:53:02 crc kubenswrapper[4748]: I0320 11:53:02.969568 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:53:03 crc kubenswrapper[4748]: I0320 11:53:03.030443 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:53:03 crc kubenswrapper[4748]: I0320 11:53:03.585017 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:53:03 crc kubenswrapper[4748]: I0320 11:53:03.643162 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jxpg8"] Mar 20 11:53:05 crc kubenswrapper[4748]: I0320 11:53:05.553670 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jxpg8" podUID="61c33938-11f4-402d-96ef-8351465eaf90" containerName="registry-server" 
containerID="cri-o://4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052" gracePeriod=2 Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.059653 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.244293 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-catalog-content\") pod \"61c33938-11f4-402d-96ef-8351465eaf90\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.244632 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-utilities\") pod \"61c33938-11f4-402d-96ef-8351465eaf90\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.244763 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs655\" (UniqueName: \"kubernetes.io/projected/61c33938-11f4-402d-96ef-8351465eaf90-kube-api-access-rs655\") pod \"61c33938-11f4-402d-96ef-8351465eaf90\" (UID: \"61c33938-11f4-402d-96ef-8351465eaf90\") " Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.246163 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-utilities" (OuterVolumeSpecName: "utilities") pod "61c33938-11f4-402d-96ef-8351465eaf90" (UID: "61c33938-11f4-402d-96ef-8351465eaf90"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.252956 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61c33938-11f4-402d-96ef-8351465eaf90-kube-api-access-rs655" (OuterVolumeSpecName: "kube-api-access-rs655") pod "61c33938-11f4-402d-96ef-8351465eaf90" (UID: "61c33938-11f4-402d-96ef-8351465eaf90"). InnerVolumeSpecName "kube-api-access-rs655". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.323941 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61c33938-11f4-402d-96ef-8351465eaf90" (UID: "61c33938-11f4-402d-96ef-8351465eaf90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.346894 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.346929 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs655\" (UniqueName: \"kubernetes.io/projected/61c33938-11f4-402d-96ef-8351465eaf90-kube-api-access-rs655\") on node \"crc\" DevicePath \"\"" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.346939 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61c33938-11f4-402d-96ef-8351465eaf90-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.563786 4748 generic.go:334] "Generic (PLEG): container finished" podID="61c33938-11f4-402d-96ef-8351465eaf90" 
containerID="4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052" exitCode=0 Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.563849 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxpg8" event={"ID":"61c33938-11f4-402d-96ef-8351465eaf90","Type":"ContainerDied","Data":"4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052"} Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.563885 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxpg8" event={"ID":"61c33938-11f4-402d-96ef-8351465eaf90","Type":"ContainerDied","Data":"a694e5537daa00b35f1a72173ef65a9b22889301bd27fcaac70c8e4c72a0828d"} Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.563905 4748 scope.go:117] "RemoveContainer" containerID="4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.565824 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jxpg8" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.582330 4748 scope.go:117] "RemoveContainer" containerID="7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.606329 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jxpg8"] Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.606718 4748 scope.go:117] "RemoveContainer" containerID="91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.620108 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jxpg8"] Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.651730 4748 scope.go:117] "RemoveContainer" containerID="4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052" Mar 20 11:53:06 crc kubenswrapper[4748]: E0320 11:53:06.652149 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052\": container with ID starting with 4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052 not found: ID does not exist" containerID="4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.652210 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052"} err="failed to get container status \"4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052\": rpc error: code = NotFound desc = could not find container \"4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052\": container with ID starting with 4ba37004f145c33d36b64d6c0dea287de753aa5e169b9de82fac55b4243f0052 not 
found: ID does not exist" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.652238 4748 scope.go:117] "RemoveContainer" containerID="7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa" Mar 20 11:53:06 crc kubenswrapper[4748]: E0320 11:53:06.652449 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa\": container with ID starting with 7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa not found: ID does not exist" containerID="7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.652478 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa"} err="failed to get container status \"7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa\": rpc error: code = NotFound desc = could not find container \"7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa\": container with ID starting with 7de0298c7108492046dbecbaffb192952d77b0ae6b75f9e106114479e17015aa not found: ID does not exist" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.652499 4748 scope.go:117] "RemoveContainer" containerID="91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97" Mar 20 11:53:06 crc kubenswrapper[4748]: E0320 11:53:06.653085 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97\": container with ID starting with 91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97 not found: ID does not exist" containerID="91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97" Mar 20 11:53:06 crc kubenswrapper[4748]: I0320 11:53:06.653113 4748 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97"} err="failed to get container status \"91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97\": rpc error: code = NotFound desc = could not find container \"91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97\": container with ID starting with 91f6e09aba8f6a841e638862221d5af64ff422b7efac38fa3ba5c050cc06ef97 not found: ID does not exist" Mar 20 11:53:07 crc kubenswrapper[4748]: I0320 11:53:07.524730 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61c33938-11f4-402d-96ef-8351465eaf90" path="/var/lib/kubelet/pods/61c33938-11f4-402d-96ef-8351465eaf90/volumes" Mar 20 11:53:12 crc kubenswrapper[4748]: I0320 11:53:12.927890 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:53:12 crc kubenswrapper[4748]: I0320 11:53:12.928196 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:53:42 crc kubenswrapper[4748]: I0320 11:53:42.928327 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:53:42 crc kubenswrapper[4748]: I0320 11:53:42.930043 4748 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:53:42 crc kubenswrapper[4748]: I0320 11:53:42.930193 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 11:53:42 crc kubenswrapper[4748]: I0320 11:53:42.931195 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:53:42 crc kubenswrapper[4748]: I0320 11:53:42.931381 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" gracePeriod=600 Mar 20 11:53:43 crc kubenswrapper[4748]: E0320 11:53:43.691332 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:53:43 crc kubenswrapper[4748]: I0320 11:53:43.899549 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" 
containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" exitCode=0 Mar 20 11:53:43 crc kubenswrapper[4748]: I0320 11:53:43.899595 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d"} Mar 20 11:53:43 crc kubenswrapper[4748]: I0320 11:53:43.899628 4748 scope.go:117] "RemoveContainer" containerID="2ca3fadedbbf1ffc460403a3135324978873632bc777457c2622cd86b79e06ad" Mar 20 11:53:43 crc kubenswrapper[4748]: I0320 11:53:43.900504 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:53:43 crc kubenswrapper[4748]: E0320 11:53:43.900858 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:53:55 crc kubenswrapper[4748]: I0320 11:53:55.521056 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:53:55 crc kubenswrapper[4748]: E0320 11:53:55.521680 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 
11:54:00.151673 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566794-86k58"] Mar 20 11:54:00 crc kubenswrapper[4748]: E0320 11:54:00.152759 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61c33938-11f4-402d-96ef-8351465eaf90" containerName="extract-content" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.152776 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="61c33938-11f4-402d-96ef-8351465eaf90" containerName="extract-content" Mar 20 11:54:00 crc kubenswrapper[4748]: E0320 11:54:00.152791 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61c33938-11f4-402d-96ef-8351465eaf90" containerName="registry-server" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.152798 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="61c33938-11f4-402d-96ef-8351465eaf90" containerName="registry-server" Mar 20 11:54:00 crc kubenswrapper[4748]: E0320 11:54:00.152875 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61c33938-11f4-402d-96ef-8351465eaf90" containerName="extract-utilities" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.152883 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="61c33938-11f4-402d-96ef-8351465eaf90" containerName="extract-utilities" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.153111 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="61c33938-11f4-402d-96ef-8351465eaf90" containerName="registry-server" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.153900 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-86k58" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.157219 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.157576 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.157729 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.163907 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566794-86k58"] Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.307954 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s67f\" (UniqueName: \"kubernetes.io/projected/8d6ad2b5-b240-441e-bfd8-8337f2b7c53a-kube-api-access-6s67f\") pod \"auto-csr-approver-29566794-86k58\" (UID: \"8d6ad2b5-b240-441e-bfd8-8337f2b7c53a\") " pod="openshift-infra/auto-csr-approver-29566794-86k58" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.409670 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s67f\" (UniqueName: \"kubernetes.io/projected/8d6ad2b5-b240-441e-bfd8-8337f2b7c53a-kube-api-access-6s67f\") pod \"auto-csr-approver-29566794-86k58\" (UID: \"8d6ad2b5-b240-441e-bfd8-8337f2b7c53a\") " pod="openshift-infra/auto-csr-approver-29566794-86k58" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.442761 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s67f\" (UniqueName: \"kubernetes.io/projected/8d6ad2b5-b240-441e-bfd8-8337f2b7c53a-kube-api-access-6s67f\") pod \"auto-csr-approver-29566794-86k58\" (UID: \"8d6ad2b5-b240-441e-bfd8-8337f2b7c53a\") " 
pod="openshift-infra/auto-csr-approver-29566794-86k58" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.488069 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-86k58" Mar 20 11:54:00 crc kubenswrapper[4748]: I0320 11:54:00.961662 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566794-86k58"] Mar 20 11:54:01 crc kubenswrapper[4748]: I0320 11:54:01.078806 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566794-86k58" event={"ID":"8d6ad2b5-b240-441e-bfd8-8337f2b7c53a","Type":"ContainerStarted","Data":"60a895f26d5b0d75889b2ef878fdae8947c18cc0344c11232cedd0bb7cb511cf"} Mar 20 11:54:03 crc kubenswrapper[4748]: I0320 11:54:03.094866 4748 generic.go:334] "Generic (PLEG): container finished" podID="8d6ad2b5-b240-441e-bfd8-8337f2b7c53a" containerID="a0fa16c072acc5eeb5fd94f12182bb8326fc395167269f32546de0c675b4b536" exitCode=0 Mar 20 11:54:03 crc kubenswrapper[4748]: I0320 11:54:03.094922 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566794-86k58" event={"ID":"8d6ad2b5-b240-441e-bfd8-8337f2b7c53a","Type":"ContainerDied","Data":"a0fa16c072acc5eeb5fd94f12182bb8326fc395167269f32546de0c675b4b536"} Mar 20 11:54:04 crc kubenswrapper[4748]: I0320 11:54:04.455812 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-86k58" Mar 20 11:54:04 crc kubenswrapper[4748]: I0320 11:54:04.588056 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s67f\" (UniqueName: \"kubernetes.io/projected/8d6ad2b5-b240-441e-bfd8-8337f2b7c53a-kube-api-access-6s67f\") pod \"8d6ad2b5-b240-441e-bfd8-8337f2b7c53a\" (UID: \"8d6ad2b5-b240-441e-bfd8-8337f2b7c53a\") " Mar 20 11:54:04 crc kubenswrapper[4748]: I0320 11:54:04.595522 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6ad2b5-b240-441e-bfd8-8337f2b7c53a-kube-api-access-6s67f" (OuterVolumeSpecName: "kube-api-access-6s67f") pod "8d6ad2b5-b240-441e-bfd8-8337f2b7c53a" (UID: "8d6ad2b5-b240-441e-bfd8-8337f2b7c53a"). InnerVolumeSpecName "kube-api-access-6s67f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:54:04 crc kubenswrapper[4748]: I0320 11:54:04.690420 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s67f\" (UniqueName: \"kubernetes.io/projected/8d6ad2b5-b240-441e-bfd8-8337f2b7c53a-kube-api-access-6s67f\") on node \"crc\" DevicePath \"\"" Mar 20 11:54:05 crc kubenswrapper[4748]: I0320 11:54:05.116151 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566794-86k58" event={"ID":"8d6ad2b5-b240-441e-bfd8-8337f2b7c53a","Type":"ContainerDied","Data":"60a895f26d5b0d75889b2ef878fdae8947c18cc0344c11232cedd0bb7cb511cf"} Mar 20 11:54:05 crc kubenswrapper[4748]: I0320 11:54:05.116208 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60a895f26d5b0d75889b2ef878fdae8947c18cc0344c11232cedd0bb7cb511cf" Mar 20 11:54:05 crc kubenswrapper[4748]: I0320 11:54:05.116223 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-86k58" Mar 20 11:54:05 crc kubenswrapper[4748]: I0320 11:54:05.555632 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-v2zsq"] Mar 20 11:54:05 crc kubenswrapper[4748]: I0320 11:54:05.586922 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-v2zsq"] Mar 20 11:54:07 crc kubenswrapper[4748]: I0320 11:54:07.516125 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:54:07 crc kubenswrapper[4748]: E0320 11:54:07.516650 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:54:07 crc kubenswrapper[4748]: I0320 11:54:07.532493 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c53b89d4-6145-4717-815a-00d6a80049a4" path="/var/lib/kubelet/pods/c53b89d4-6145-4717-815a-00d6a80049a4/volumes" Mar 20 11:54:21 crc kubenswrapper[4748]: I0320 11:54:21.515673 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:54:21 crc kubenswrapper[4748]: E0320 11:54:21.516515 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" 
podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.073735 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nnfnz"] Mar 20 11:54:23 crc kubenswrapper[4748]: E0320 11:54:23.074589 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6ad2b5-b240-441e-bfd8-8337f2b7c53a" containerName="oc" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.074609 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6ad2b5-b240-441e-bfd8-8337f2b7c53a" containerName="oc" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.074914 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6ad2b5-b240-441e-bfd8-8337f2b7c53a" containerName="oc" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.076728 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.085287 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnfnz"] Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.178715 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-catalog-content\") pod \"redhat-marketplace-nnfnz\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.178888 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqcvc\" (UniqueName: \"kubernetes.io/projected/2e8ec044-4258-42b8-9a84-653d94d9b76b-kube-api-access-jqcvc\") pod \"redhat-marketplace-nnfnz\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc 
kubenswrapper[4748]: I0320 11:54:23.179118 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-utilities\") pod \"redhat-marketplace-nnfnz\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.280467 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-utilities\") pod \"redhat-marketplace-nnfnz\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.281032 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-utilities\") pod \"redhat-marketplace-nnfnz\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.281489 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-catalog-content\") pod \"redhat-marketplace-nnfnz\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.281982 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-catalog-content\") pod \"redhat-marketplace-nnfnz\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.282276 4748 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqcvc\" (UniqueName: \"kubernetes.io/projected/2e8ec044-4258-42b8-9a84-653d94d9b76b-kube-api-access-jqcvc\") pod \"redhat-marketplace-nnfnz\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.304682 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqcvc\" (UniqueName: \"kubernetes.io/projected/2e8ec044-4258-42b8-9a84-653d94d9b76b-kube-api-access-jqcvc\") pod \"redhat-marketplace-nnfnz\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.402030 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:23 crc kubenswrapper[4748]: I0320 11:54:23.944906 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnfnz"] Mar 20 11:54:24 crc kubenswrapper[4748]: I0320 11:54:24.308702 4748 generic.go:334] "Generic (PLEG): container finished" podID="2e8ec044-4258-42b8-9a84-653d94d9b76b" containerID="ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764" exitCode=0 Mar 20 11:54:24 crc kubenswrapper[4748]: I0320 11:54:24.308746 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnfnz" event={"ID":"2e8ec044-4258-42b8-9a84-653d94d9b76b","Type":"ContainerDied","Data":"ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764"} Mar 20 11:54:24 crc kubenswrapper[4748]: I0320 11:54:24.308772 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnfnz" 
event={"ID":"2e8ec044-4258-42b8-9a84-653d94d9b76b","Type":"ContainerStarted","Data":"b48b977ad72041e3374bc48658eed1b35904713e9cf88abb8594092f39938d97"} Mar 20 11:54:24 crc kubenswrapper[4748]: I0320 11:54:24.711605 4748 scope.go:117] "RemoveContainer" containerID="af83612a5d6f19ae622228644002233a9d3a159489750155ead34016e5ae33ff" Mar 20 11:54:25 crc kubenswrapper[4748]: I0320 11:54:25.437995 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnfnz" event={"ID":"2e8ec044-4258-42b8-9a84-653d94d9b76b","Type":"ContainerStarted","Data":"95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7"} Mar 20 11:54:26 crc kubenswrapper[4748]: I0320 11:54:26.450544 4748 generic.go:334] "Generic (PLEG): container finished" podID="2e8ec044-4258-42b8-9a84-653d94d9b76b" containerID="95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7" exitCode=0 Mar 20 11:54:26 crc kubenswrapper[4748]: I0320 11:54:26.450606 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnfnz" event={"ID":"2e8ec044-4258-42b8-9a84-653d94d9b76b","Type":"ContainerDied","Data":"95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7"} Mar 20 11:54:26 crc kubenswrapper[4748]: I0320 11:54:26.451095 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnfnz" event={"ID":"2e8ec044-4258-42b8-9a84-653d94d9b76b","Type":"ContainerStarted","Data":"7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58"} Mar 20 11:54:26 crc kubenswrapper[4748]: I0320 11:54:26.481270 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nnfnz" podStartSLOduration=1.947297786 podStartE2EDuration="3.481241817s" podCreationTimestamp="2026-03-20 11:54:23 +0000 UTC" firstStartedPulling="2026-03-20 11:54:24.311117998 +0000 UTC m=+4699.452663812" lastFinishedPulling="2026-03-20 11:54:25.845062029 +0000 
UTC m=+4700.986607843" observedRunningTime="2026-03-20 11:54:26.471594633 +0000 UTC m=+4701.613140457" watchObservedRunningTime="2026-03-20 11:54:26.481241817 +0000 UTC m=+4701.622787631" Mar 20 11:54:33 crc kubenswrapper[4748]: I0320 11:54:33.403977 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:33 crc kubenswrapper[4748]: I0320 11:54:33.404589 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:33 crc kubenswrapper[4748]: I0320 11:54:33.463873 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:33 crc kubenswrapper[4748]: I0320 11:54:33.561154 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:33 crc kubenswrapper[4748]: I0320 11:54:33.703022 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnfnz"] Mar 20 11:54:34 crc kubenswrapper[4748]: I0320 11:54:34.515156 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:54:34 crc kubenswrapper[4748]: E0320 11:54:34.515458 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:54:35 crc kubenswrapper[4748]: I0320 11:54:35.523111 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nnfnz" 
podUID="2e8ec044-4258-42b8-9a84-653d94d9b76b" containerName="registry-server" containerID="cri-o://7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58" gracePeriod=2 Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.041903 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.173638 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqcvc\" (UniqueName: \"kubernetes.io/projected/2e8ec044-4258-42b8-9a84-653d94d9b76b-kube-api-access-jqcvc\") pod \"2e8ec044-4258-42b8-9a84-653d94d9b76b\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.173698 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-catalog-content\") pod \"2e8ec044-4258-42b8-9a84-653d94d9b76b\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.173803 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-utilities\") pod \"2e8ec044-4258-42b8-9a84-653d94d9b76b\" (UID: \"2e8ec044-4258-42b8-9a84-653d94d9b76b\") " Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.174797 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-utilities" (OuterVolumeSpecName: "utilities") pod "2e8ec044-4258-42b8-9a84-653d94d9b76b" (UID: "2e8ec044-4258-42b8-9a84-653d94d9b76b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.179855 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8ec044-4258-42b8-9a84-653d94d9b76b-kube-api-access-jqcvc" (OuterVolumeSpecName: "kube-api-access-jqcvc") pod "2e8ec044-4258-42b8-9a84-653d94d9b76b" (UID: "2e8ec044-4258-42b8-9a84-653d94d9b76b"). InnerVolumeSpecName "kube-api-access-jqcvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.215380 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e8ec044-4258-42b8-9a84-653d94d9b76b" (UID: "2e8ec044-4258-42b8-9a84-653d94d9b76b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.275705 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqcvc\" (UniqueName: \"kubernetes.io/projected/2e8ec044-4258-42b8-9a84-653d94d9b76b-kube-api-access-jqcvc\") on node \"crc\" DevicePath \"\"" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.275761 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.275773 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e8ec044-4258-42b8-9a84-653d94d9b76b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.533616 4748 generic.go:334] "Generic (PLEG): container finished" podID="2e8ec044-4258-42b8-9a84-653d94d9b76b" 
containerID="7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58" exitCode=0 Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.533664 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnfnz" event={"ID":"2e8ec044-4258-42b8-9a84-653d94d9b76b","Type":"ContainerDied","Data":"7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58"} Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.533693 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnfnz" event={"ID":"2e8ec044-4258-42b8-9a84-653d94d9b76b","Type":"ContainerDied","Data":"b48b977ad72041e3374bc48658eed1b35904713e9cf88abb8594092f39938d97"} Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.533715 4748 scope.go:117] "RemoveContainer" containerID="7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.533875 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnfnz" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.576037 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnfnz"] Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.576244 4748 scope.go:117] "RemoveContainer" containerID="95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.586397 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnfnz"] Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.602974 4748 scope.go:117] "RemoveContainer" containerID="ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.640547 4748 scope.go:117] "RemoveContainer" containerID="7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58" Mar 20 11:54:36 crc kubenswrapper[4748]: E0320 11:54:36.644512 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58\": container with ID starting with 7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58 not found: ID does not exist" containerID="7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.644652 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58"} err="failed to get container status \"7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58\": rpc error: code = NotFound desc = could not find container \"7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58\": container with ID starting with 7c3556812645bc864b2a4883b40e3e00424a2273406a5db52166247884834d58 not found: 
ID does not exist" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.644732 4748 scope.go:117] "RemoveContainer" containerID="95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7" Mar 20 11:54:36 crc kubenswrapper[4748]: E0320 11:54:36.645677 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7\": container with ID starting with 95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7 not found: ID does not exist" containerID="95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.645737 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7"} err="failed to get container status \"95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7\": rpc error: code = NotFound desc = could not find container \"95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7\": container with ID starting with 95558c0ea4167d02cb1f585d6d6b06688995b034bb2aab2c586f53cf4500eed7 not found: ID does not exist" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.645775 4748 scope.go:117] "RemoveContainer" containerID="ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764" Mar 20 11:54:36 crc kubenswrapper[4748]: E0320 11:54:36.646169 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764\": container with ID starting with ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764 not found: ID does not exist" containerID="ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764" Mar 20 11:54:36 crc kubenswrapper[4748]: I0320 11:54:36.646206 4748 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764"} err="failed to get container status \"ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764\": rpc error: code = NotFound desc = could not find container \"ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764\": container with ID starting with ae928fc83664ca822d261a5cce4d5df988c282f04089cc7c22482697b628e764 not found: ID does not exist" Mar 20 11:54:37 crc kubenswrapper[4748]: I0320 11:54:37.528519 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e8ec044-4258-42b8-9a84-653d94d9b76b" path="/var/lib/kubelet/pods/2e8ec044-4258-42b8-9a84-653d94d9b76b/volumes" Mar 20 11:54:43 crc kubenswrapper[4748]: I0320 11:54:43.631671 4748 generic.go:334] "Generic (PLEG): container finished" podID="90b5aeb3-bb8e-40ae-872f-72c5f98e260f" containerID="234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1" exitCode=0 Mar 20 11:54:43 crc kubenswrapper[4748]: I0320 11:54:43.631771 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xzd8l/must-gather-ksq57" event={"ID":"90b5aeb3-bb8e-40ae-872f-72c5f98e260f","Type":"ContainerDied","Data":"234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1"} Mar 20 11:54:43 crc kubenswrapper[4748]: I0320 11:54:43.633069 4748 scope.go:117] "RemoveContainer" containerID="234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1" Mar 20 11:54:43 crc kubenswrapper[4748]: I0320 11:54:43.872471 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xzd8l_must-gather-ksq57_90b5aeb3-bb8e-40ae-872f-72c5f98e260f/gather/0.log" Mar 20 11:54:47 crc kubenswrapper[4748]: I0320 11:54:47.516756 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:54:47 crc kubenswrapper[4748]: E0320 11:54:47.517541 4748 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:54:51 crc kubenswrapper[4748]: I0320 11:54:51.938757 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xzd8l/must-gather-ksq57"] Mar 20 11:54:51 crc kubenswrapper[4748]: I0320 11:54:51.939529 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-xzd8l/must-gather-ksq57" podUID="90b5aeb3-bb8e-40ae-872f-72c5f98e260f" containerName="copy" containerID="cri-o://3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f" gracePeriod=2 Mar 20 11:54:51 crc kubenswrapper[4748]: I0320 11:54:51.951358 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xzd8l/must-gather-ksq57"] Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.385072 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xzd8l_must-gather-ksq57_90b5aeb3-bb8e-40ae-872f-72c5f98e260f/copy/0.log" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.385920 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xzd8l/must-gather-ksq57" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.504894 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqd6x\" (UniqueName: \"kubernetes.io/projected/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-kube-api-access-gqd6x\") pod \"90b5aeb3-bb8e-40ae-872f-72c5f98e260f\" (UID: \"90b5aeb3-bb8e-40ae-872f-72c5f98e260f\") " Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.505068 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-must-gather-output\") pod \"90b5aeb3-bb8e-40ae-872f-72c5f98e260f\" (UID: \"90b5aeb3-bb8e-40ae-872f-72c5f98e260f\") " Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.511473 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-kube-api-access-gqd6x" (OuterVolumeSpecName: "kube-api-access-gqd6x") pod "90b5aeb3-bb8e-40ae-872f-72c5f98e260f" (UID: "90b5aeb3-bb8e-40ae-872f-72c5f98e260f"). InnerVolumeSpecName "kube-api-access-gqd6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.611191 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqd6x\" (UniqueName: \"kubernetes.io/projected/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-kube-api-access-gqd6x\") on node \"crc\" DevicePath \"\"" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.687006 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "90b5aeb3-bb8e-40ae-872f-72c5f98e260f" (UID: "90b5aeb3-bb8e-40ae-872f-72c5f98e260f"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.712852 4748 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90b5aeb3-bb8e-40ae-872f-72c5f98e260f-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.714047 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xzd8l_must-gather-ksq57_90b5aeb3-bb8e-40ae-872f-72c5f98e260f/copy/0.log" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.714733 4748 generic.go:334] "Generic (PLEG): container finished" podID="90b5aeb3-bb8e-40ae-872f-72c5f98e260f" containerID="3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f" exitCode=143 Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.714788 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xzd8l/must-gather-ksq57" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.714800 4748 scope.go:117] "RemoveContainer" containerID="3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.742629 4748 scope.go:117] "RemoveContainer" containerID="234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.843924 4748 scope.go:117] "RemoveContainer" containerID="3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f" Mar 20 11:54:52 crc kubenswrapper[4748]: E0320 11:54:52.844433 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f\": container with ID starting with 3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f not found: ID does not exist" 
containerID="3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.844479 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f"} err="failed to get container status \"3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f\": rpc error: code = NotFound desc = could not find container \"3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f\": container with ID starting with 3ca133b9481d157ad3d194051c919f77e5ae07e18fbbbac77374e496a0c3014f not found: ID does not exist" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.844505 4748 scope.go:117] "RemoveContainer" containerID="234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1" Mar 20 11:54:52 crc kubenswrapper[4748]: E0320 11:54:52.844776 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1\": container with ID starting with 234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1 not found: ID does not exist" containerID="234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1" Mar 20 11:54:52 crc kubenswrapper[4748]: I0320 11:54:52.844807 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1"} err="failed to get container status \"234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1\": rpc error: code = NotFound desc = could not find container \"234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1\": container with ID starting with 234c1c5440f3d405ca09c49c1a6144bf682d9dfe23bde4ed61e4c1c2a432d5e1 not found: ID does not exist" Mar 20 11:54:53 crc kubenswrapper[4748]: I0320 11:54:53.526335 4748 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b5aeb3-bb8e-40ae-872f-72c5f98e260f" path="/var/lib/kubelet/pods/90b5aeb3-bb8e-40ae-872f-72c5f98e260f/volumes" Mar 20 11:54:59 crc kubenswrapper[4748]: I0320 11:54:59.516015 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:54:59 crc kubenswrapper[4748]: E0320 11:54:59.517103 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:55:14 crc kubenswrapper[4748]: I0320 11:55:14.515540 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:55:14 crc kubenswrapper[4748]: E0320 11:55:14.516360 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:55:24 crc kubenswrapper[4748]: I0320 11:55:24.793993 4748 scope.go:117] "RemoveContainer" containerID="78e6d1588f7cbac868528b66b9a591f39b76c8b4868a17ca120ec9b6d9f17127" Mar 20 11:55:24 crc kubenswrapper[4748]: I0320 11:55:24.820073 4748 scope.go:117] "RemoveContainer" containerID="2be97010bce97f769f1b67f353ad303768415e75c01fac7123ff6411bed4a7d5" Mar 20 11:55:28 crc kubenswrapper[4748]: I0320 11:55:28.515575 4748 scope.go:117] "RemoveContainer" 
containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:55:28 crc kubenswrapper[4748]: E0320 11:55:28.516209 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:55:39 crc kubenswrapper[4748]: I0320 11:55:39.516486 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:55:39 crc kubenswrapper[4748]: E0320 11:55:39.517311 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:55:52 crc kubenswrapper[4748]: I0320 11:55:52.516041 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:55:52 crc kubenswrapper[4748]: E0320 11:55:52.516940 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.158863 4748 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566796-7chdb"] Mar 20 11:56:00 crc kubenswrapper[4748]: E0320 11:56:00.160071 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b5aeb3-bb8e-40ae-872f-72c5f98e260f" containerName="copy" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.160093 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b5aeb3-bb8e-40ae-872f-72c5f98e260f" containerName="copy" Mar 20 11:56:00 crc kubenswrapper[4748]: E0320 11:56:00.160112 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8ec044-4258-42b8-9a84-653d94d9b76b" containerName="extract-utilities" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.160181 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8ec044-4258-42b8-9a84-653d94d9b76b" containerName="extract-utilities" Mar 20 11:56:00 crc kubenswrapper[4748]: E0320 11:56:00.160217 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b5aeb3-bb8e-40ae-872f-72c5f98e260f" containerName="gather" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.160229 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b5aeb3-bb8e-40ae-872f-72c5f98e260f" containerName="gather" Mar 20 11:56:00 crc kubenswrapper[4748]: E0320 11:56:00.160241 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8ec044-4258-42b8-9a84-653d94d9b76b" containerName="extract-content" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.160249 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8ec044-4258-42b8-9a84-653d94d9b76b" containerName="extract-content" Mar 20 11:56:00 crc kubenswrapper[4748]: E0320 11:56:00.160274 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8ec044-4258-42b8-9a84-653d94d9b76b" containerName="registry-server" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.160282 4748 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2e8ec044-4258-42b8-9a84-653d94d9b76b" containerName="registry-server" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.160521 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b5aeb3-bb8e-40ae-872f-72c5f98e260f" containerName="copy" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.160546 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8ec044-4258-42b8-9a84-653d94d9b76b" containerName="registry-server" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.160561 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b5aeb3-bb8e-40ae-872f-72c5f98e260f" containerName="gather" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.161710 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566796-7chdb" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.164195 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.164650 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.164669 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.172359 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566796-7chdb"] Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.248855 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfjgc\" (UniqueName: \"kubernetes.io/projected/cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3-kube-api-access-wfjgc\") pod \"auto-csr-approver-29566796-7chdb\" (UID: \"cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3\") " 
pod="openshift-infra/auto-csr-approver-29566796-7chdb" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.351090 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfjgc\" (UniqueName: \"kubernetes.io/projected/cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3-kube-api-access-wfjgc\") pod \"auto-csr-approver-29566796-7chdb\" (UID: \"cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3\") " pod="openshift-infra/auto-csr-approver-29566796-7chdb" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.377932 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfjgc\" (UniqueName: \"kubernetes.io/projected/cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3-kube-api-access-wfjgc\") pod \"auto-csr-approver-29566796-7chdb\" (UID: \"cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3\") " pod="openshift-infra/auto-csr-approver-29566796-7chdb" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.489112 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566796-7chdb" Mar 20 11:56:00 crc kubenswrapper[4748]: I0320 11:56:00.932911 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566796-7chdb"] Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.321762 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566796-7chdb" event={"ID":"cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3","Type":"ContainerStarted","Data":"d31b24b085eadbea6283422297fd9c08fa4391ab585e9f251d84556eb1673260"} Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.367433 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ttvk7"] Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.371138 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.378784 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttvk7"] Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.472741 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-utilities\") pod \"redhat-operators-ttvk7\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.472782 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-catalog-content\") pod \"redhat-operators-ttvk7\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.472913 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr7lx\" (UniqueName: \"kubernetes.io/projected/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-kube-api-access-gr7lx\") pod \"redhat-operators-ttvk7\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.575351 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr7lx\" (UniqueName: \"kubernetes.io/projected/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-kube-api-access-gr7lx\") pod \"redhat-operators-ttvk7\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.575466 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-utilities\") pod \"redhat-operators-ttvk7\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.575483 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-catalog-content\") pod \"redhat-operators-ttvk7\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.576049 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-catalog-content\") pod \"redhat-operators-ttvk7\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.576153 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-utilities\") pod \"redhat-operators-ttvk7\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.604274 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr7lx\" (UniqueName: \"kubernetes.io/projected/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-kube-api-access-gr7lx\") pod \"redhat-operators-ttvk7\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:01 crc kubenswrapper[4748]: I0320 11:56:01.706669 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:02 crc kubenswrapper[4748]: I0320 11:56:02.140402 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttvk7"] Mar 20 11:56:02 crc kubenswrapper[4748]: I0320 11:56:02.332985 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566796-7chdb" event={"ID":"cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3","Type":"ContainerStarted","Data":"62ddfd967246f7d6057932ccb065fed02ec0476bf4b6493fac2d4d97f9c433c7"} Mar 20 11:56:02 crc kubenswrapper[4748]: I0320 11:56:02.335210 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttvk7" event={"ID":"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce","Type":"ContainerStarted","Data":"aed539bb134bef0131fc874ee3624fa8b7da16acd270c112268065334d9ab97a"} Mar 20 11:56:02 crc kubenswrapper[4748]: I0320 11:56:02.347567 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566796-7chdb" podStartSLOduration=1.478404977 podStartE2EDuration="2.347547486s" podCreationTimestamp="2026-03-20 11:56:00 +0000 UTC" firstStartedPulling="2026-03-20 11:56:00.932886352 +0000 UTC m=+4796.074432166" lastFinishedPulling="2026-03-20 11:56:01.802028861 +0000 UTC m=+4796.943574675" observedRunningTime="2026-03-20 11:56:02.342845267 +0000 UTC m=+4797.484391081" watchObservedRunningTime="2026-03-20 11:56:02.347547486 +0000 UTC m=+4797.489093310" Mar 20 11:56:03 crc kubenswrapper[4748]: I0320 11:56:03.344948 4748 generic.go:334] "Generic (PLEG): container finished" podID="cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3" containerID="62ddfd967246f7d6057932ccb065fed02ec0476bf4b6493fac2d4d97f9c433c7" exitCode=0 Mar 20 11:56:03 crc kubenswrapper[4748]: I0320 11:56:03.345032 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566796-7chdb" 
event={"ID":"cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3","Type":"ContainerDied","Data":"62ddfd967246f7d6057932ccb065fed02ec0476bf4b6493fac2d4d97f9c433c7"} Mar 20 11:56:03 crc kubenswrapper[4748]: I0320 11:56:03.346890 4748 generic.go:334] "Generic (PLEG): container finished" podID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerID="57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9" exitCode=0 Mar 20 11:56:03 crc kubenswrapper[4748]: I0320 11:56:03.346916 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttvk7" event={"ID":"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce","Type":"ContainerDied","Data":"57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9"} Mar 20 11:56:04 crc kubenswrapper[4748]: I0320 11:56:04.358496 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttvk7" event={"ID":"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce","Type":"ContainerStarted","Data":"56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620"} Mar 20 11:56:04 crc kubenswrapper[4748]: I0320 11:56:04.700530 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566796-7chdb" Mar 20 11:56:04 crc kubenswrapper[4748]: I0320 11:56:04.862110 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfjgc\" (UniqueName: \"kubernetes.io/projected/cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3-kube-api-access-wfjgc\") pod \"cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3\" (UID: \"cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3\") " Mar 20 11:56:04 crc kubenswrapper[4748]: I0320 11:56:04.871461 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3-kube-api-access-wfjgc" (OuterVolumeSpecName: "kube-api-access-wfjgc") pod "cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3" (UID: "cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3"). 
InnerVolumeSpecName "kube-api-access-wfjgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:56:04 crc kubenswrapper[4748]: I0320 11:56:04.965261 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfjgc\" (UniqueName: \"kubernetes.io/projected/cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3-kube-api-access-wfjgc\") on node \"crc\" DevicePath \"\"" Mar 20 11:56:05 crc kubenswrapper[4748]: I0320 11:56:05.371173 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566796-7chdb" Mar 20 11:56:05 crc kubenswrapper[4748]: I0320 11:56:05.371158 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566796-7chdb" event={"ID":"cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3","Type":"ContainerDied","Data":"d31b24b085eadbea6283422297fd9c08fa4391ab585e9f251d84556eb1673260"} Mar 20 11:56:05 crc kubenswrapper[4748]: I0320 11:56:05.372584 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d31b24b085eadbea6283422297fd9c08fa4391ab585e9f251d84556eb1673260" Mar 20 11:56:05 crc kubenswrapper[4748]: I0320 11:56:05.426438 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-r97n4"] Mar 20 11:56:05 crc kubenswrapper[4748]: I0320 11:56:05.434705 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-r97n4"] Mar 20 11:56:05 crc kubenswrapper[4748]: I0320 11:56:05.529511 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bae15e5-0199-42c4-babb-5da895996b8a" path="/var/lib/kubelet/pods/5bae15e5-0199-42c4-babb-5da895996b8a/volumes" Mar 20 11:56:06 crc kubenswrapper[4748]: I0320 11:56:06.380771 4748 generic.go:334] "Generic (PLEG): container finished" podID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerID="56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620" exitCode=0 Mar 20 11:56:06 crc 
kubenswrapper[4748]: I0320 11:56:06.380882 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttvk7" event={"ID":"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce","Type":"ContainerDied","Data":"56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620"} Mar 20 11:56:06 crc kubenswrapper[4748]: I0320 11:56:06.515551 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:56:06 crc kubenswrapper[4748]: E0320 11:56:06.515795 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:56:07 crc kubenswrapper[4748]: I0320 11:56:07.393515 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttvk7" event={"ID":"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce","Type":"ContainerStarted","Data":"b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721"} Mar 20 11:56:07 crc kubenswrapper[4748]: I0320 11:56:07.412869 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ttvk7" podStartSLOduration=2.938019057 podStartE2EDuration="6.41283115s" podCreationTimestamp="2026-03-20 11:56:01 +0000 UTC" firstStartedPulling="2026-03-20 11:56:03.349102724 +0000 UTC m=+4798.490648538" lastFinishedPulling="2026-03-20 11:56:06.823914817 +0000 UTC m=+4801.965460631" observedRunningTime="2026-03-20 11:56:07.410706006 +0000 UTC m=+4802.552251830" watchObservedRunningTime="2026-03-20 11:56:07.41283115 +0000 UTC m=+4802.554376964" Mar 20 11:56:11 crc kubenswrapper[4748]: I0320 11:56:11.707669 4748 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:11 crc kubenswrapper[4748]: I0320 11:56:11.708396 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:12 crc kubenswrapper[4748]: I0320 11:56:12.751932 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ttvk7" podUID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerName="registry-server" probeResult="failure" output=< Mar 20 11:56:12 crc kubenswrapper[4748]: timeout: failed to connect service ":50051" within 1s Mar 20 11:56:12 crc kubenswrapper[4748]: > Mar 20 11:56:21 crc kubenswrapper[4748]: I0320 11:56:21.516028 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:56:21 crc kubenswrapper[4748]: E0320 11:56:21.516862 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:56:21 crc kubenswrapper[4748]: I0320 11:56:21.758816 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:21 crc kubenswrapper[4748]: I0320 11:56:21.810527 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:22 crc kubenswrapper[4748]: I0320 11:56:22.006187 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttvk7"] Mar 20 11:56:23 crc kubenswrapper[4748]: I0320 
11:56:23.537407 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ttvk7" podUID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerName="registry-server" containerID="cri-o://b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721" gracePeriod=2 Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.035614 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.168392 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-catalog-content\") pod \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.168479 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr7lx\" (UniqueName: \"kubernetes.io/projected/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-kube-api-access-gr7lx\") pod \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.168754 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-utilities\") pod \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\" (UID: \"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce\") " Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.169642 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-utilities" (OuterVolumeSpecName: "utilities") pod "13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" (UID: "13a90aeb-6cf2-4e75-8842-07cb97c6a8ce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.177191 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-kube-api-access-gr7lx" (OuterVolumeSpecName: "kube-api-access-gr7lx") pod "13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" (UID: "13a90aeb-6cf2-4e75-8842-07cb97c6a8ce"). InnerVolumeSpecName "kube-api-access-gr7lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.271765 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.271802 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr7lx\" (UniqueName: \"kubernetes.io/projected/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-kube-api-access-gr7lx\") on node \"crc\" DevicePath \"\"" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.316940 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" (UID: "13a90aeb-6cf2-4e75-8842-07cb97c6a8ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.372733 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.548071 4748 generic.go:334] "Generic (PLEG): container finished" podID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerID="b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721" exitCode=0 Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.548123 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttvk7" event={"ID":"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce","Type":"ContainerDied","Data":"b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721"} Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.548155 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttvk7" event={"ID":"13a90aeb-6cf2-4e75-8842-07cb97c6a8ce","Type":"ContainerDied","Data":"aed539bb134bef0131fc874ee3624fa8b7da16acd270c112268065334d9ab97a"} Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.548175 4748 scope.go:117] "RemoveContainer" containerID="b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.548369 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ttvk7" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.573786 4748 scope.go:117] "RemoveContainer" containerID="56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.588610 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttvk7"] Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.599060 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ttvk7"] Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.614117 4748 scope.go:117] "RemoveContainer" containerID="57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.644956 4748 scope.go:117] "RemoveContainer" containerID="b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721" Mar 20 11:56:24 crc kubenswrapper[4748]: E0320 11:56:24.645396 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721\": container with ID starting with b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721 not found: ID does not exist" containerID="b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.645447 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721"} err="failed to get container status \"b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721\": rpc error: code = NotFound desc = could not find container \"b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721\": container with ID starting with b182a3a3d062af72bf94d3233f66bd2d763348d765c2c6280076d0d9b743d721 not found: ID does 
not exist" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.645476 4748 scope.go:117] "RemoveContainer" containerID="56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620" Mar 20 11:56:24 crc kubenswrapper[4748]: E0320 11:56:24.645904 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620\": container with ID starting with 56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620 not found: ID does not exist" containerID="56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.645943 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620"} err="failed to get container status \"56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620\": rpc error: code = NotFound desc = could not find container \"56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620\": container with ID starting with 56fd6be114cc55c4529a2ec77a604d1a0a7504a2f15396916725ce88fc6c8620 not found: ID does not exist" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.645972 4748 scope.go:117] "RemoveContainer" containerID="57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9" Mar 20 11:56:24 crc kubenswrapper[4748]: E0320 11:56:24.646296 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9\": container with ID starting with 57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9 not found: ID does not exist" containerID="57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.646349 4748 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9"} err="failed to get container status \"57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9\": rpc error: code = NotFound desc = could not find container \"57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9\": container with ID starting with 57f00e01cfd966fbac89ba921ab8d0fed951645f36b169f19ae009afb2ac33d9 not found: ID does not exist" Mar 20 11:56:24 crc kubenswrapper[4748]: I0320 11:56:24.935865 4748 scope.go:117] "RemoveContainer" containerID="dc459f89e289407986eebe2f2d008cec672587e5f42521b934c399b13bdc5aa0" Mar 20 11:56:25 crc kubenswrapper[4748]: I0320 11:56:25.525897 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" path="/var/lib/kubelet/pods/13a90aeb-6cf2-4e75-8842-07cb97c6a8ce/volumes" Mar 20 11:56:34 crc kubenswrapper[4748]: I0320 11:56:34.515658 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:56:34 crc kubenswrapper[4748]: E0320 11:56:34.517708 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:56:47 crc kubenswrapper[4748]: I0320 11:56:47.516320 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:56:47 crc kubenswrapper[4748]: E0320 11:56:47.517180 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:57:00 crc kubenswrapper[4748]: I0320 11:57:00.514992 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:57:00 crc kubenswrapper[4748]: E0320 11:57:00.515760 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:57:12 crc kubenswrapper[4748]: I0320 11:57:12.515486 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:57:12 crc kubenswrapper[4748]: E0320 11:57:12.516107 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:57:24 crc kubenswrapper[4748]: I0320 11:57:24.515994 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:57:24 crc kubenswrapper[4748]: E0320 11:57:24.517312 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:57:36 crc kubenswrapper[4748]: I0320 11:57:36.515637 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:57:36 crc kubenswrapper[4748]: E0320 11:57:36.516465 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:57:47 crc kubenswrapper[4748]: I0320 11:57:47.515045 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:57:47 crc kubenswrapper[4748]: E0320 11:57:47.515772 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.133723 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vpnp5/must-gather-plw76"] Mar 20 11:57:56 crc kubenswrapper[4748]: E0320 11:57:56.134723 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" 
containerName="extract-content" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.134741 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerName="extract-content" Mar 20 11:57:56 crc kubenswrapper[4748]: E0320 11:57:56.134757 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerName="registry-server" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.134764 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerName="registry-server" Mar 20 11:57:56 crc kubenswrapper[4748]: E0320 11:57:56.134794 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerName="extract-utilities" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.134803 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerName="extract-utilities" Mar 20 11:57:56 crc kubenswrapper[4748]: E0320 11:57:56.134821 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3" containerName="oc" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.134828 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3" containerName="oc" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.135066 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a90aeb-6cf2-4e75-8842-07cb97c6a8ce" containerName="registry-server" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.135093 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3" containerName="oc" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.138116 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpnp5/must-gather-plw76" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.140784 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vpnp5"/"default-dockercfg-t6hdt" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.141045 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vpnp5"/"openshift-service-ca.crt" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.141132 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vpnp5"/"kube-root-ca.crt" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.181624 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vpnp5/must-gather-plw76"] Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.232630 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q57hp\" (UniqueName: \"kubernetes.io/projected/90804811-2638-4a6e-8f6c-259fe4e36763-kube-api-access-q57hp\") pod \"must-gather-plw76\" (UID: \"90804811-2638-4a6e-8f6c-259fe4e36763\") " pod="openshift-must-gather-vpnp5/must-gather-plw76" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.232719 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90804811-2638-4a6e-8f6c-259fe4e36763-must-gather-output\") pod \"must-gather-plw76\" (UID: \"90804811-2638-4a6e-8f6c-259fe4e36763\") " pod="openshift-must-gather-vpnp5/must-gather-plw76" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.333992 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q57hp\" (UniqueName: \"kubernetes.io/projected/90804811-2638-4a6e-8f6c-259fe4e36763-kube-api-access-q57hp\") pod \"must-gather-plw76\" (UID: \"90804811-2638-4a6e-8f6c-259fe4e36763\") " 
pod="openshift-must-gather-vpnp5/must-gather-plw76" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.334081 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90804811-2638-4a6e-8f6c-259fe4e36763-must-gather-output\") pod \"must-gather-plw76\" (UID: \"90804811-2638-4a6e-8f6c-259fe4e36763\") " pod="openshift-must-gather-vpnp5/must-gather-plw76" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.334447 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90804811-2638-4a6e-8f6c-259fe4e36763-must-gather-output\") pod \"must-gather-plw76\" (UID: \"90804811-2638-4a6e-8f6c-259fe4e36763\") " pod="openshift-must-gather-vpnp5/must-gather-plw76" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.358202 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q57hp\" (UniqueName: \"kubernetes.io/projected/90804811-2638-4a6e-8f6c-259fe4e36763-kube-api-access-q57hp\") pod \"must-gather-plw76\" (UID: \"90804811-2638-4a6e-8f6c-259fe4e36763\") " pod="openshift-must-gather-vpnp5/must-gather-plw76" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.462807 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpnp5/must-gather-plw76" Mar 20 11:57:56 crc kubenswrapper[4748]: I0320 11:57:56.948237 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vpnp5/must-gather-plw76"] Mar 20 11:57:57 crc kubenswrapper[4748]: I0320 11:57:57.406244 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpnp5/must-gather-plw76" event={"ID":"90804811-2638-4a6e-8f6c-259fe4e36763","Type":"ContainerStarted","Data":"a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb"} Mar 20 11:57:57 crc kubenswrapper[4748]: I0320 11:57:57.406754 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpnp5/must-gather-plw76" event={"ID":"90804811-2638-4a6e-8f6c-259fe4e36763","Type":"ContainerStarted","Data":"ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104"} Mar 20 11:57:57 crc kubenswrapper[4748]: I0320 11:57:57.406778 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpnp5/must-gather-plw76" event={"ID":"90804811-2638-4a6e-8f6c-259fe4e36763","Type":"ContainerStarted","Data":"7d250ede2680f0bbc8aacdec76da2199d5c19514e2591c980e6fc2d7fd9afa00"} Mar 20 11:57:57 crc kubenswrapper[4748]: I0320 11:57:57.432262 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vpnp5/must-gather-plw76" podStartSLOduration=1.432239532 podStartE2EDuration="1.432239532s" podCreationTimestamp="2026-03-20 11:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:57:57.422223069 +0000 UTC m=+4912.563768883" watchObservedRunningTime="2026-03-20 11:57:57.432239532 +0000 UTC m=+4912.573785346" Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.148506 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566798-klbz6"] Mar 20 11:58:00 crc 
kubenswrapper[4748]: I0320 11:58:00.151441 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566798-klbz6" Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.154884 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.155386 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.155821 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.162713 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566798-klbz6"] Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.212450 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68n2b\" (UniqueName: \"kubernetes.io/projected/d3d414f8-6f11-4e8a-8add-cb74488d3568-kube-api-access-68n2b\") pod \"auto-csr-approver-29566798-klbz6\" (UID: \"d3d414f8-6f11-4e8a-8add-cb74488d3568\") " pod="openshift-infra/auto-csr-approver-29566798-klbz6" Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.314592 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68n2b\" (UniqueName: \"kubernetes.io/projected/d3d414f8-6f11-4e8a-8add-cb74488d3568-kube-api-access-68n2b\") pod \"auto-csr-approver-29566798-klbz6\" (UID: \"d3d414f8-6f11-4e8a-8add-cb74488d3568\") " pod="openshift-infra/auto-csr-approver-29566798-klbz6" Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.340667 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68n2b\" (UniqueName: \"kubernetes.io/projected/d3d414f8-6f11-4e8a-8add-cb74488d3568-kube-api-access-68n2b\") pod 
\"auto-csr-approver-29566798-klbz6\" (UID: \"d3d414f8-6f11-4e8a-8add-cb74488d3568\") " pod="openshift-infra/auto-csr-approver-29566798-klbz6" Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.479013 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566798-klbz6" Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.922569 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566798-klbz6"] Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.960183 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.982800 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vpnp5/crc-debug-4qjf2"] Mar 20 11:58:00 crc kubenswrapper[4748]: I0320 11:58:00.984207 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" Mar 20 11:58:01 crc kubenswrapper[4748]: I0320 11:58:01.128410 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjphj\" (UniqueName: \"kubernetes.io/projected/f9d98dbd-da7e-45cb-8491-93859a93b0c3-kube-api-access-fjphj\") pod \"crc-debug-4qjf2\" (UID: \"f9d98dbd-da7e-45cb-8491-93859a93b0c3\") " pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" Mar 20 11:58:01 crc kubenswrapper[4748]: I0320 11:58:01.128501 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9d98dbd-da7e-45cb-8491-93859a93b0c3-host\") pod \"crc-debug-4qjf2\" (UID: \"f9d98dbd-da7e-45cb-8491-93859a93b0c3\") " pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" Mar 20 11:58:01 crc kubenswrapper[4748]: I0320 11:58:01.230911 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjphj\" 
(UniqueName: \"kubernetes.io/projected/f9d98dbd-da7e-45cb-8491-93859a93b0c3-kube-api-access-fjphj\") pod \"crc-debug-4qjf2\" (UID: \"f9d98dbd-da7e-45cb-8491-93859a93b0c3\") " pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" Mar 20 11:58:01 crc kubenswrapper[4748]: I0320 11:58:01.231004 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9d98dbd-da7e-45cb-8491-93859a93b0c3-host\") pod \"crc-debug-4qjf2\" (UID: \"f9d98dbd-da7e-45cb-8491-93859a93b0c3\") " pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" Mar 20 11:58:01 crc kubenswrapper[4748]: I0320 11:58:01.231189 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9d98dbd-da7e-45cb-8491-93859a93b0c3-host\") pod \"crc-debug-4qjf2\" (UID: \"f9d98dbd-da7e-45cb-8491-93859a93b0c3\") " pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" Mar 20 11:58:01 crc kubenswrapper[4748]: I0320 11:58:01.252775 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjphj\" (UniqueName: \"kubernetes.io/projected/f9d98dbd-da7e-45cb-8491-93859a93b0c3-kube-api-access-fjphj\") pod \"crc-debug-4qjf2\" (UID: \"f9d98dbd-da7e-45cb-8491-93859a93b0c3\") " pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" Mar 20 11:58:01 crc kubenswrapper[4748]: I0320 11:58:01.320381 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" Mar 20 11:58:01 crc kubenswrapper[4748]: I0320 11:58:01.439925 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566798-klbz6" event={"ID":"d3d414f8-6f11-4e8a-8add-cb74488d3568","Type":"ContainerStarted","Data":"f5ea3d4aef3704b03ba34eb7cea3d87d284d6465ed762c7113d8e4b539de25b7"} Mar 20 11:58:01 crc kubenswrapper[4748]: I0320 11:58:01.441076 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" event={"ID":"f9d98dbd-da7e-45cb-8491-93859a93b0c3","Type":"ContainerStarted","Data":"2e1775d51b7bc88cc03cd63ffe17f6d32dcc93c888fc00819a48ab43b5703718"} Mar 20 11:58:01 crc kubenswrapper[4748]: I0320 11:58:01.515124 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:58:01 crc kubenswrapper[4748]: E0320 11:58:01.515399 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:58:02 crc kubenswrapper[4748]: I0320 11:58:02.452269 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" event={"ID":"f9d98dbd-da7e-45cb-8491-93859a93b0c3","Type":"ContainerStarted","Data":"4cd940b45880cfb45cda841808feb1b79d2aaa1acc6ece545a5aadba48087178"} Mar 20 11:58:02 crc kubenswrapper[4748]: I0320 11:58:02.472895 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566798-klbz6" 
event={"ID":"d3d414f8-6f11-4e8a-8add-cb74488d3568","Type":"ContainerStarted","Data":"2e58deefd358ab82d99a1de69a45a8a81612661f4628d642d9adb6be22fc770e"} Mar 20 11:58:02 crc kubenswrapper[4748]: I0320 11:58:02.473566 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" podStartSLOduration=2.47355173 podStartE2EDuration="2.47355173s" podCreationTimestamp="2026-03-20 11:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:58:02.471080047 +0000 UTC m=+4917.612625871" watchObservedRunningTime="2026-03-20 11:58:02.47355173 +0000 UTC m=+4917.615097544" Mar 20 11:58:02 crc kubenswrapper[4748]: I0320 11:58:02.498512 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566798-klbz6" podStartSLOduration=1.305274395 podStartE2EDuration="2.49849105s" podCreationTimestamp="2026-03-20 11:58:00 +0000 UTC" firstStartedPulling="2026-03-20 11:58:00.959892511 +0000 UTC m=+4916.101438325" lastFinishedPulling="2026-03-20 11:58:02.153109166 +0000 UTC m=+4917.294654980" observedRunningTime="2026-03-20 11:58:02.491092263 +0000 UTC m=+4917.632638097" watchObservedRunningTime="2026-03-20 11:58:02.49849105 +0000 UTC m=+4917.640036864" Mar 20 11:58:03 crc kubenswrapper[4748]: I0320 11:58:03.482867 4748 generic.go:334] "Generic (PLEG): container finished" podID="d3d414f8-6f11-4e8a-8add-cb74488d3568" containerID="2e58deefd358ab82d99a1de69a45a8a81612661f4628d642d9adb6be22fc770e" exitCode=0 Mar 20 11:58:03 crc kubenswrapper[4748]: I0320 11:58:03.482961 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566798-klbz6" event={"ID":"d3d414f8-6f11-4e8a-8add-cb74488d3568","Type":"ContainerDied","Data":"2e58deefd358ab82d99a1de69a45a8a81612661f4628d642d9adb6be22fc770e"} Mar 20 11:58:05 crc kubenswrapper[4748]: I0320 11:58:05.057353 
4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566798-klbz6" Mar 20 11:58:05 crc kubenswrapper[4748]: I0320 11:58:05.213167 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68n2b\" (UniqueName: \"kubernetes.io/projected/d3d414f8-6f11-4e8a-8add-cb74488d3568-kube-api-access-68n2b\") pod \"d3d414f8-6f11-4e8a-8add-cb74488d3568\" (UID: \"d3d414f8-6f11-4e8a-8add-cb74488d3568\") " Mar 20 11:58:05 crc kubenswrapper[4748]: I0320 11:58:05.222314 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d414f8-6f11-4e8a-8add-cb74488d3568-kube-api-access-68n2b" (OuterVolumeSpecName: "kube-api-access-68n2b") pod "d3d414f8-6f11-4e8a-8add-cb74488d3568" (UID: "d3d414f8-6f11-4e8a-8add-cb74488d3568"). InnerVolumeSpecName "kube-api-access-68n2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:58:05 crc kubenswrapper[4748]: I0320 11:58:05.316117 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68n2b\" (UniqueName: \"kubernetes.io/projected/d3d414f8-6f11-4e8a-8add-cb74488d3568-kube-api-access-68n2b\") on node \"crc\" DevicePath \"\"" Mar 20 11:58:05 crc kubenswrapper[4748]: I0320 11:58:05.499958 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566798-klbz6" event={"ID":"d3d414f8-6f11-4e8a-8add-cb74488d3568","Type":"ContainerDied","Data":"f5ea3d4aef3704b03ba34eb7cea3d87d284d6465ed762c7113d8e4b539de25b7"} Mar 20 11:58:05 crc kubenswrapper[4748]: I0320 11:58:05.500246 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5ea3d4aef3704b03ba34eb7cea3d87d284d6465ed762c7113d8e4b539de25b7" Mar 20 11:58:05 crc kubenswrapper[4748]: I0320 11:58:05.500015 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566798-klbz6" Mar 20 11:58:05 crc kubenswrapper[4748]: I0320 11:58:05.564613 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566792-7t722"] Mar 20 11:58:05 crc kubenswrapper[4748]: I0320 11:58:05.572595 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566792-7t722"] Mar 20 11:58:07 crc kubenswrapper[4748]: I0320 11:58:07.529344 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2071e2d0-4ad8-4cf2-b478-502bf98abb04" path="/var/lib/kubelet/pods/2071e2d0-4ad8-4cf2-b478-502bf98abb04/volumes" Mar 20 11:58:15 crc kubenswrapper[4748]: I0320 11:58:15.522625 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:58:15 crc kubenswrapper[4748]: E0320 11:58:15.523561 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.056242 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mjrql"] Mar 20 11:58:19 crc kubenswrapper[4748]: E0320 11:58:19.057044 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d414f8-6f11-4e8a-8add-cb74488d3568" containerName="oc" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.057061 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d414f8-6f11-4e8a-8add-cb74488d3568" containerName="oc" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.057239 4748 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d3d414f8-6f11-4e8a-8add-cb74488d3568" containerName="oc" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.058467 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.076788 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mjrql"] Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.164087 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngn87\" (UniqueName: \"kubernetes.io/projected/3f6bc139-86b7-42e4-93b2-8293185931dc-kube-api-access-ngn87\") pod \"community-operators-mjrql\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.164400 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-utilities\") pod \"community-operators-mjrql\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.164626 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-catalog-content\") pod \"community-operators-mjrql\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.266296 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngn87\" (UniqueName: \"kubernetes.io/projected/3f6bc139-86b7-42e4-93b2-8293185931dc-kube-api-access-ngn87\") pod \"community-operators-mjrql\" 
(UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.266372 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-utilities\") pod \"community-operators-mjrql\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.266472 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-catalog-content\") pod \"community-operators-mjrql\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.266875 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-utilities\") pod \"community-operators-mjrql\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.267006 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-catalog-content\") pod \"community-operators-mjrql\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.315573 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngn87\" (UniqueName: \"kubernetes.io/projected/3f6bc139-86b7-42e4-93b2-8293185931dc-kube-api-access-ngn87\") pod \"community-operators-mjrql\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " 
pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:19 crc kubenswrapper[4748]: I0320 11:58:19.388386 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:20 crc kubenswrapper[4748]: I0320 11:58:20.026636 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mjrql"] Mar 20 11:58:20 crc kubenswrapper[4748]: I0320 11:58:20.666861 4748 generic.go:334] "Generic (PLEG): container finished" podID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerID="53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070" exitCode=0 Mar 20 11:58:20 crc kubenswrapper[4748]: I0320 11:58:20.667074 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjrql" event={"ID":"3f6bc139-86b7-42e4-93b2-8293185931dc","Type":"ContainerDied","Data":"53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070"} Mar 20 11:58:20 crc kubenswrapper[4748]: I0320 11:58:20.667115 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjrql" event={"ID":"3f6bc139-86b7-42e4-93b2-8293185931dc","Type":"ContainerStarted","Data":"c669079778478f33fdc012ab592b91a0de9d3ad0288483245b7cab06d097919c"} Mar 20 11:58:21 crc kubenswrapper[4748]: I0320 11:58:21.677070 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjrql" event={"ID":"3f6bc139-86b7-42e4-93b2-8293185931dc","Type":"ContainerStarted","Data":"ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b"} Mar 20 11:58:22 crc kubenswrapper[4748]: I0320 11:58:22.688169 4748 generic.go:334] "Generic (PLEG): container finished" podID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerID="ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b" exitCode=0 Mar 20 11:58:22 crc kubenswrapper[4748]: I0320 11:58:22.688218 4748 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-mjrql" event={"ID":"3f6bc139-86b7-42e4-93b2-8293185931dc","Type":"ContainerDied","Data":"ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b"} Mar 20 11:58:24 crc kubenswrapper[4748]: I0320 11:58:24.712598 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjrql" event={"ID":"3f6bc139-86b7-42e4-93b2-8293185931dc","Type":"ContainerStarted","Data":"b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8"} Mar 20 11:58:24 crc kubenswrapper[4748]: I0320 11:58:24.736280 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mjrql" podStartSLOduration=3.26045953 podStartE2EDuration="5.73625535s" podCreationTimestamp="2026-03-20 11:58:19 +0000 UTC" firstStartedPulling="2026-03-20 11:58:20.668597994 +0000 UTC m=+4935.810143808" lastFinishedPulling="2026-03-20 11:58:23.144393814 +0000 UTC m=+4938.285939628" observedRunningTime="2026-03-20 11:58:24.731001497 +0000 UTC m=+4939.872547331" watchObservedRunningTime="2026-03-20 11:58:24.73625535 +0000 UTC m=+4939.877801164" Mar 20 11:58:25 crc kubenswrapper[4748]: I0320 11:58:25.055416 4748 scope.go:117] "RemoveContainer" containerID="f4982bba606bf16f715397ff55f7527addd6a3b11a360d72d75a6724a692388b" Mar 20 11:58:29 crc kubenswrapper[4748]: I0320 11:58:29.389403 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:29 crc kubenswrapper[4748]: I0320 11:58:29.389689 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:29 crc kubenswrapper[4748]: I0320 11:58:29.443000 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:30 crc kubenswrapper[4748]: I0320 11:58:30.354780 4748 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:30 crc kubenswrapper[4748]: I0320 11:58:30.403663 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mjrql"] Mar 20 11:58:30 crc kubenswrapper[4748]: I0320 11:58:30.515177 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:58:30 crc kubenswrapper[4748]: E0320 11:58:30.515470 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 11:58:32 crc kubenswrapper[4748]: I0320 11:58:32.323254 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mjrql" podUID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerName="registry-server" containerID="cri-o://b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8" gracePeriod=2 Mar 20 11:58:32 crc kubenswrapper[4748]: I0320 11:58:32.777286 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:32 crc kubenswrapper[4748]: I0320 11:58:32.933703 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-utilities\") pod \"3f6bc139-86b7-42e4-93b2-8293185931dc\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " Mar 20 11:58:32 crc kubenswrapper[4748]: I0320 11:58:32.933848 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-catalog-content\") pod \"3f6bc139-86b7-42e4-93b2-8293185931dc\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " Mar 20 11:58:32 crc kubenswrapper[4748]: I0320 11:58:32.933929 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngn87\" (UniqueName: \"kubernetes.io/projected/3f6bc139-86b7-42e4-93b2-8293185931dc-kube-api-access-ngn87\") pod \"3f6bc139-86b7-42e4-93b2-8293185931dc\" (UID: \"3f6bc139-86b7-42e4-93b2-8293185931dc\") " Mar 20 11:58:32 crc kubenswrapper[4748]: I0320 11:58:32.934642 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-utilities" (OuterVolumeSpecName: "utilities") pod "3f6bc139-86b7-42e4-93b2-8293185931dc" (UID: "3f6bc139-86b7-42e4-93b2-8293185931dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:58:32 crc kubenswrapper[4748]: I0320 11:58:32.940576 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6bc139-86b7-42e4-93b2-8293185931dc-kube-api-access-ngn87" (OuterVolumeSpecName: "kube-api-access-ngn87") pod "3f6bc139-86b7-42e4-93b2-8293185931dc" (UID: "3f6bc139-86b7-42e4-93b2-8293185931dc"). InnerVolumeSpecName "kube-api-access-ngn87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:58:32 crc kubenswrapper[4748]: I0320 11:58:32.994640 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f6bc139-86b7-42e4-93b2-8293185931dc" (UID: "3f6bc139-86b7-42e4-93b2-8293185931dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.036576 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngn87\" (UniqueName: \"kubernetes.io/projected/3f6bc139-86b7-42e4-93b2-8293185931dc-kube-api-access-ngn87\") on node \"crc\" DevicePath \"\"" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.036606 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.036615 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f6bc139-86b7-42e4-93b2-8293185931dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.335919 4748 generic.go:334] "Generic (PLEG): container finished" podID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerID="b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8" exitCode=0 Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.336002 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mjrql" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.336023 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjrql" event={"ID":"3f6bc139-86b7-42e4-93b2-8293185931dc","Type":"ContainerDied","Data":"b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8"} Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.336370 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjrql" event={"ID":"3f6bc139-86b7-42e4-93b2-8293185931dc","Type":"ContainerDied","Data":"c669079778478f33fdc012ab592b91a0de9d3ad0288483245b7cab06d097919c"} Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.336389 4748 scope.go:117] "RemoveContainer" containerID="b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.359729 4748 scope.go:117] "RemoveContainer" containerID="ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.385027 4748 scope.go:117] "RemoveContainer" containerID="53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.405097 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mjrql"] Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.418202 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mjrql"] Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.436595 4748 scope.go:117] "RemoveContainer" containerID="b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8" Mar 20 11:58:33 crc kubenswrapper[4748]: E0320 11:58:33.437257 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8\": container with ID starting with b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8 not found: ID does not exist" containerID="b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.437421 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8"} err="failed to get container status \"b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8\": rpc error: code = NotFound desc = could not find container \"b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8\": container with ID starting with b1d380866fd7f9c7d2d231f8f090d3cfb43fb598f8444c9cbd088de1240d81c8 not found: ID does not exist" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.437563 4748 scope.go:117] "RemoveContainer" containerID="ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b" Mar 20 11:58:33 crc kubenswrapper[4748]: E0320 11:58:33.438421 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b\": container with ID starting with ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b not found: ID does not exist" containerID="ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.438463 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b"} err="failed to get container status \"ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b\": rpc error: code = NotFound desc = could not find container \"ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b\": container with ID 
starting with ab094f8ad46c28315b8a9a7637ff1516aa02eb70baa282fb91600301cd78a82b not found: ID does not exist" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.438493 4748 scope.go:117] "RemoveContainer" containerID="53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070" Mar 20 11:58:33 crc kubenswrapper[4748]: E0320 11:58:33.439255 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070\": container with ID starting with 53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070 not found: ID does not exist" containerID="53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.439291 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070"} err="failed to get container status \"53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070\": rpc error: code = NotFound desc = could not find container \"53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070\": container with ID starting with 53cec5256db1dc0d1ac4fe490f377475702598c6dbc850b3ea6bd9b5a127b070 not found: ID does not exist" Mar 20 11:58:33 crc kubenswrapper[4748]: I0320 11:58:33.543243 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f6bc139-86b7-42e4-93b2-8293185931dc" path="/var/lib/kubelet/pods/3f6bc139-86b7-42e4-93b2-8293185931dc/volumes" Mar 20 11:58:43 crc kubenswrapper[4748]: I0320 11:58:43.424830 4748 generic.go:334] "Generic (PLEG): container finished" podID="f9d98dbd-da7e-45cb-8491-93859a93b0c3" containerID="4cd940b45880cfb45cda841808feb1b79d2aaa1acc6ece545a5aadba48087178" exitCode=0 Mar 20 11:58:43 crc kubenswrapper[4748]: I0320 11:58:43.425010 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" event={"ID":"f9d98dbd-da7e-45cb-8491-93859a93b0c3","Type":"ContainerDied","Data":"4cd940b45880cfb45cda841808feb1b79d2aaa1acc6ece545a5aadba48087178"} Mar 20 11:58:44 crc kubenswrapper[4748]: I0320 11:58:44.545682 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" Mar 20 11:58:44 crc kubenswrapper[4748]: I0320 11:58:44.577457 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vpnp5/crc-debug-4qjf2"] Mar 20 11:58:44 crc kubenswrapper[4748]: I0320 11:58:44.586415 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vpnp5/crc-debug-4qjf2"] Mar 20 11:58:44 crc kubenswrapper[4748]: I0320 11:58:44.666613 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjphj\" (UniqueName: \"kubernetes.io/projected/f9d98dbd-da7e-45cb-8491-93859a93b0c3-kube-api-access-fjphj\") pod \"f9d98dbd-da7e-45cb-8491-93859a93b0c3\" (UID: \"f9d98dbd-da7e-45cb-8491-93859a93b0c3\") " Mar 20 11:58:44 crc kubenswrapper[4748]: I0320 11:58:44.666937 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9d98dbd-da7e-45cb-8491-93859a93b0c3-host\") pod \"f9d98dbd-da7e-45cb-8491-93859a93b0c3\" (UID: \"f9d98dbd-da7e-45cb-8491-93859a93b0c3\") " Mar 20 11:58:44 crc kubenswrapper[4748]: I0320 11:58:44.667167 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9d98dbd-da7e-45cb-8491-93859a93b0c3-host" (OuterVolumeSpecName: "host") pod "f9d98dbd-da7e-45cb-8491-93859a93b0c3" (UID: "f9d98dbd-da7e-45cb-8491-93859a93b0c3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:58:44 crc kubenswrapper[4748]: I0320 11:58:44.667825 4748 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9d98dbd-da7e-45cb-8491-93859a93b0c3-host\") on node \"crc\" DevicePath \"\"" Mar 20 11:58:44 crc kubenswrapper[4748]: I0320 11:58:44.673725 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9d98dbd-da7e-45cb-8491-93859a93b0c3-kube-api-access-fjphj" (OuterVolumeSpecName: "kube-api-access-fjphj") pod "f9d98dbd-da7e-45cb-8491-93859a93b0c3" (UID: "f9d98dbd-da7e-45cb-8491-93859a93b0c3"). InnerVolumeSpecName "kube-api-access-fjphj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:58:44 crc kubenswrapper[4748]: I0320 11:58:44.769195 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjphj\" (UniqueName: \"kubernetes.io/projected/f9d98dbd-da7e-45cb-8491-93859a93b0c3-kube-api-access-fjphj\") on node \"crc\" DevicePath \"\"" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.442936 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e1775d51b7bc88cc03cd63ffe17f6d32dcc93c888fc00819a48ab43b5703718" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.442990 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-4qjf2" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.522067 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.527385 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9d98dbd-da7e-45cb-8491-93859a93b0c3" path="/var/lib/kubelet/pods/f9d98dbd-da7e-45cb-8491-93859a93b0c3/volumes" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.849171 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vpnp5/crc-debug-827b7"] Mar 20 11:58:45 crc kubenswrapper[4748]: E0320 11:58:45.849806 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerName="registry-server" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.849819 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerName="registry-server" Mar 20 11:58:45 crc kubenswrapper[4748]: E0320 11:58:45.849841 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d98dbd-da7e-45cb-8491-93859a93b0c3" containerName="container-00" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.849851 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d98dbd-da7e-45cb-8491-93859a93b0c3" containerName="container-00" Mar 20 11:58:45 crc kubenswrapper[4748]: E0320 11:58:45.849877 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerName="extract-utilities" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.849886 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerName="extract-utilities" Mar 20 11:58:45 crc kubenswrapper[4748]: E0320 11:58:45.849913 4748 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerName="extract-content" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.849919 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerName="extract-content" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.850116 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6bc139-86b7-42e4-93b2-8293185931dc" containerName="registry-server" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.850135 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d98dbd-da7e-45cb-8491-93859a93b0c3" containerName="container-00" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.850718 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-827b7" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.911052 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q888\" (UniqueName: \"kubernetes.io/projected/de1944cd-0667-46dc-9483-d8525943279e-kube-api-access-7q888\") pod \"crc-debug-827b7\" (UID: \"de1944cd-0667-46dc-9483-d8525943279e\") " pod="openshift-must-gather-vpnp5/crc-debug-827b7" Mar 20 11:58:45 crc kubenswrapper[4748]: I0320 11:58:45.911344 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de1944cd-0667-46dc-9483-d8525943279e-host\") pod \"crc-debug-827b7\" (UID: \"de1944cd-0667-46dc-9483-d8525943279e\") " pod="openshift-must-gather-vpnp5/crc-debug-827b7" Mar 20 11:58:46 crc kubenswrapper[4748]: I0320 11:58:46.013737 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q888\" (UniqueName: \"kubernetes.io/projected/de1944cd-0667-46dc-9483-d8525943279e-kube-api-access-7q888\") pod \"crc-debug-827b7\" (UID: \"de1944cd-0667-46dc-9483-d8525943279e\") 
" pod="openshift-must-gather-vpnp5/crc-debug-827b7" Mar 20 11:58:46 crc kubenswrapper[4748]: I0320 11:58:46.013931 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de1944cd-0667-46dc-9483-d8525943279e-host\") pod \"crc-debug-827b7\" (UID: \"de1944cd-0667-46dc-9483-d8525943279e\") " pod="openshift-must-gather-vpnp5/crc-debug-827b7" Mar 20 11:58:46 crc kubenswrapper[4748]: I0320 11:58:46.014064 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de1944cd-0667-46dc-9483-d8525943279e-host\") pod \"crc-debug-827b7\" (UID: \"de1944cd-0667-46dc-9483-d8525943279e\") " pod="openshift-must-gather-vpnp5/crc-debug-827b7" Mar 20 11:58:46 crc kubenswrapper[4748]: I0320 11:58:46.032907 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q888\" (UniqueName: \"kubernetes.io/projected/de1944cd-0667-46dc-9483-d8525943279e-kube-api-access-7q888\") pod \"crc-debug-827b7\" (UID: \"de1944cd-0667-46dc-9483-d8525943279e\") " pod="openshift-must-gather-vpnp5/crc-debug-827b7" Mar 20 11:58:46 crc kubenswrapper[4748]: I0320 11:58:46.167405 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-827b7" Mar 20 11:58:46 crc kubenswrapper[4748]: I0320 11:58:46.454475 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpnp5/crc-debug-827b7" event={"ID":"de1944cd-0667-46dc-9483-d8525943279e","Type":"ContainerStarted","Data":"1729455ba7adc1f65024369f256253ffaeec71d3249a5476cf01c3e56c8959a1"} Mar 20 11:58:46 crc kubenswrapper[4748]: I0320 11:58:46.454782 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpnp5/crc-debug-827b7" event={"ID":"de1944cd-0667-46dc-9483-d8525943279e","Type":"ContainerStarted","Data":"6cbf0e3b0aa07d59d63a0fb0d9fe327cf732173c1a60134c5b0bd1b1478145f4"} Mar 20 11:58:46 crc kubenswrapper[4748]: I0320 11:58:46.458207 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"5ce726fbb2e9c7a69b53896504587de01360f862a5598de467da49c2048886e6"} Mar 20 11:58:46 crc kubenswrapper[4748]: I0320 11:58:46.471523 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vpnp5/crc-debug-827b7" podStartSLOduration=1.4715069920000001 podStartE2EDuration="1.471506992s" podCreationTimestamp="2026-03-20 11:58:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:58:46.465816478 +0000 UTC m=+4961.607362292" watchObservedRunningTime="2026-03-20 11:58:46.471506992 +0000 UTC m=+4961.613052806" Mar 20 11:58:47 crc kubenswrapper[4748]: I0320 11:58:47.466912 4748 generic.go:334] "Generic (PLEG): container finished" podID="de1944cd-0667-46dc-9483-d8525943279e" containerID="1729455ba7adc1f65024369f256253ffaeec71d3249a5476cf01c3e56c8959a1" exitCode=0 Mar 20 11:58:47 crc kubenswrapper[4748]: I0320 11:58:47.466962 4748 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-must-gather-vpnp5/crc-debug-827b7" event={"ID":"de1944cd-0667-46dc-9483-d8525943279e","Type":"ContainerDied","Data":"1729455ba7adc1f65024369f256253ffaeec71d3249a5476cf01c3e56c8959a1"} Mar 20 11:58:48 crc kubenswrapper[4748]: I0320 11:58:48.640935 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-827b7" Mar 20 11:58:48 crc kubenswrapper[4748]: I0320 11:58:48.689610 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vpnp5/crc-debug-827b7"] Mar 20 11:58:48 crc kubenswrapper[4748]: I0320 11:58:48.698418 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vpnp5/crc-debug-827b7"] Mar 20 11:58:48 crc kubenswrapper[4748]: I0320 11:58:48.764596 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q888\" (UniqueName: \"kubernetes.io/projected/de1944cd-0667-46dc-9483-d8525943279e-kube-api-access-7q888\") pod \"de1944cd-0667-46dc-9483-d8525943279e\" (UID: \"de1944cd-0667-46dc-9483-d8525943279e\") " Mar 20 11:58:48 crc kubenswrapper[4748]: I0320 11:58:48.765078 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de1944cd-0667-46dc-9483-d8525943279e-host\") pod \"de1944cd-0667-46dc-9483-d8525943279e\" (UID: \"de1944cd-0667-46dc-9483-d8525943279e\") " Mar 20 11:58:48 crc kubenswrapper[4748]: I0320 11:58:48.765522 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de1944cd-0667-46dc-9483-d8525943279e-host" (OuterVolumeSpecName: "host") pod "de1944cd-0667-46dc-9483-d8525943279e" (UID: "de1944cd-0667-46dc-9483-d8525943279e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:58:48 crc kubenswrapper[4748]: I0320 11:58:48.770261 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1944cd-0667-46dc-9483-d8525943279e-kube-api-access-7q888" (OuterVolumeSpecName: "kube-api-access-7q888") pod "de1944cd-0667-46dc-9483-d8525943279e" (UID: "de1944cd-0667-46dc-9483-d8525943279e"). InnerVolumeSpecName "kube-api-access-7q888". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:58:48 crc kubenswrapper[4748]: I0320 11:58:48.869289 4748 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de1944cd-0667-46dc-9483-d8525943279e-host\") on node \"crc\" DevicePath \"\"" Mar 20 11:58:48 crc kubenswrapper[4748]: I0320 11:58:48.869332 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q888\" (UniqueName: \"kubernetes.io/projected/de1944cd-0667-46dc-9483-d8525943279e-kube-api-access-7q888\") on node \"crc\" DevicePath \"\"" Mar 20 11:58:49 crc kubenswrapper[4748]: I0320 11:58:49.499977 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cbf0e3b0aa07d59d63a0fb0d9fe327cf732173c1a60134c5b0bd1b1478145f4" Mar 20 11:58:49 crc kubenswrapper[4748]: I0320 11:58:49.500050 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-827b7" Mar 20 11:58:49 crc kubenswrapper[4748]: I0320 11:58:49.524211 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de1944cd-0667-46dc-9483-d8525943279e" path="/var/lib/kubelet/pods/de1944cd-0667-46dc-9483-d8525943279e/volumes" Mar 20 11:58:49 crc kubenswrapper[4748]: I0320 11:58:49.873135 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vpnp5/crc-debug-ggj48"] Mar 20 11:58:49 crc kubenswrapper[4748]: E0320 11:58:49.873760 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1944cd-0667-46dc-9483-d8525943279e" containerName="container-00" Mar 20 11:58:49 crc kubenswrapper[4748]: I0320 11:58:49.873772 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1944cd-0667-46dc-9483-d8525943279e" containerName="container-00" Mar 20 11:58:49 crc kubenswrapper[4748]: I0320 11:58:49.873994 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1944cd-0667-46dc-9483-d8525943279e" containerName="container-00" Mar 20 11:58:49 crc kubenswrapper[4748]: I0320 11:58:49.874748 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-ggj48" Mar 20 11:58:49 crc kubenswrapper[4748]: I0320 11:58:49.989924 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg266\" (UniqueName: \"kubernetes.io/projected/66f8b58b-d6fc-48cc-a6a2-69caba458a86-kube-api-access-wg266\") pod \"crc-debug-ggj48\" (UID: \"66f8b58b-d6fc-48cc-a6a2-69caba458a86\") " pod="openshift-must-gather-vpnp5/crc-debug-ggj48" Mar 20 11:58:49 crc kubenswrapper[4748]: I0320 11:58:49.990272 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66f8b58b-d6fc-48cc-a6a2-69caba458a86-host\") pod \"crc-debug-ggj48\" (UID: \"66f8b58b-d6fc-48cc-a6a2-69caba458a86\") " pod="openshift-must-gather-vpnp5/crc-debug-ggj48" Mar 20 11:58:50 crc kubenswrapper[4748]: I0320 11:58:50.092574 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66f8b58b-d6fc-48cc-a6a2-69caba458a86-host\") pod \"crc-debug-ggj48\" (UID: \"66f8b58b-d6fc-48cc-a6a2-69caba458a86\") " pod="openshift-must-gather-vpnp5/crc-debug-ggj48" Mar 20 11:58:50 crc kubenswrapper[4748]: I0320 11:58:50.092787 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66f8b58b-d6fc-48cc-a6a2-69caba458a86-host\") pod \"crc-debug-ggj48\" (UID: \"66f8b58b-d6fc-48cc-a6a2-69caba458a86\") " pod="openshift-must-gather-vpnp5/crc-debug-ggj48" Mar 20 11:58:50 crc kubenswrapper[4748]: I0320 11:58:50.093116 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg266\" (UniqueName: \"kubernetes.io/projected/66f8b58b-d6fc-48cc-a6a2-69caba458a86-kube-api-access-wg266\") pod \"crc-debug-ggj48\" (UID: \"66f8b58b-d6fc-48cc-a6a2-69caba458a86\") " pod="openshift-must-gather-vpnp5/crc-debug-ggj48" Mar 20 11:58:50 crc 
kubenswrapper[4748]: I0320 11:58:50.119723 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg266\" (UniqueName: \"kubernetes.io/projected/66f8b58b-d6fc-48cc-a6a2-69caba458a86-kube-api-access-wg266\") pod \"crc-debug-ggj48\" (UID: \"66f8b58b-d6fc-48cc-a6a2-69caba458a86\") " pod="openshift-must-gather-vpnp5/crc-debug-ggj48" Mar 20 11:58:50 crc kubenswrapper[4748]: I0320 11:58:50.192097 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-ggj48" Mar 20 11:58:50 crc kubenswrapper[4748]: I0320 11:58:50.511056 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpnp5/crc-debug-ggj48" event={"ID":"66f8b58b-d6fc-48cc-a6a2-69caba458a86","Type":"ContainerStarted","Data":"fdfb7aa86b110a97e6cfc34ebf2a50ddc10578404f70f5d186f4bb54376b0a75"} Mar 20 11:58:51 crc kubenswrapper[4748]: I0320 11:58:51.521258 4748 generic.go:334] "Generic (PLEG): container finished" podID="66f8b58b-d6fc-48cc-a6a2-69caba458a86" containerID="18496f0996a4affd4ce740b37c8eedc89441e6adeb7c10107690505881752df6" exitCode=0 Mar 20 11:58:51 crc kubenswrapper[4748]: I0320 11:58:51.524872 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpnp5/crc-debug-ggj48" event={"ID":"66f8b58b-d6fc-48cc-a6a2-69caba458a86","Type":"ContainerDied","Data":"18496f0996a4affd4ce740b37c8eedc89441e6adeb7c10107690505881752df6"} Mar 20 11:58:51 crc kubenswrapper[4748]: I0320 11:58:51.560124 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vpnp5/crc-debug-ggj48"] Mar 20 11:58:51 crc kubenswrapper[4748]: I0320 11:58:51.567912 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vpnp5/crc-debug-ggj48"] Mar 20 11:58:52 crc kubenswrapper[4748]: I0320 11:58:52.633002 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-ggj48" Mar 20 11:58:52 crc kubenswrapper[4748]: I0320 11:58:52.746471 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg266\" (UniqueName: \"kubernetes.io/projected/66f8b58b-d6fc-48cc-a6a2-69caba458a86-kube-api-access-wg266\") pod \"66f8b58b-d6fc-48cc-a6a2-69caba458a86\" (UID: \"66f8b58b-d6fc-48cc-a6a2-69caba458a86\") " Mar 20 11:58:52 crc kubenswrapper[4748]: I0320 11:58:52.746809 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66f8b58b-d6fc-48cc-a6a2-69caba458a86-host\") pod \"66f8b58b-d6fc-48cc-a6a2-69caba458a86\" (UID: \"66f8b58b-d6fc-48cc-a6a2-69caba458a86\") " Mar 20 11:58:52 crc kubenswrapper[4748]: I0320 11:58:52.746920 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66f8b58b-d6fc-48cc-a6a2-69caba458a86-host" (OuterVolumeSpecName: "host") pod "66f8b58b-d6fc-48cc-a6a2-69caba458a86" (UID: "66f8b58b-d6fc-48cc-a6a2-69caba458a86"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:58:52 crc kubenswrapper[4748]: I0320 11:58:52.747344 4748 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66f8b58b-d6fc-48cc-a6a2-69caba458a86-host\") on node \"crc\" DevicePath \"\"" Mar 20 11:58:52 crc kubenswrapper[4748]: I0320 11:58:52.760107 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f8b58b-d6fc-48cc-a6a2-69caba458a86-kube-api-access-wg266" (OuterVolumeSpecName: "kube-api-access-wg266") pod "66f8b58b-d6fc-48cc-a6a2-69caba458a86" (UID: "66f8b58b-d6fc-48cc-a6a2-69caba458a86"). InnerVolumeSpecName "kube-api-access-wg266". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:58:52 crc kubenswrapper[4748]: I0320 11:58:52.848955 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg266\" (UniqueName: \"kubernetes.io/projected/66f8b58b-d6fc-48cc-a6a2-69caba458a86-kube-api-access-wg266\") on node \"crc\" DevicePath \"\"" Mar 20 11:58:53 crc kubenswrapper[4748]: I0320 11:58:53.526376 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66f8b58b-d6fc-48cc-a6a2-69caba458a86" path="/var/lib/kubelet/pods/66f8b58b-d6fc-48cc-a6a2-69caba458a86/volumes" Mar 20 11:58:53 crc kubenswrapper[4748]: I0320 11:58:53.539134 4748 scope.go:117] "RemoveContainer" containerID="18496f0996a4affd4ce740b37c8eedc89441e6adeb7c10107690505881752df6" Mar 20 11:58:53 crc kubenswrapper[4748]: I0320 11:58:53.539354 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpnp5/crc-debug-ggj48" Mar 20 11:59:36 crc kubenswrapper[4748]: I0320 11:59:36.846360 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77b5d7f4f8-8jmkc_201e8a26-7bfa-40c7-aa3d-bf32c1344d61/barbican-api/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.046628 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b69f7d5cb-p5jsk_ed96228e-6626-468c-bf60-a1073dfc123e/barbican-keystone-listener/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.054323 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77b5d7f4f8-8jmkc_201e8a26-7bfa-40c7-aa3d-bf32c1344d61/barbican-api-log/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.117123 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b69f7d5cb-p5jsk_ed96228e-6626-468c-bf60-a1073dfc123e/barbican-keystone-listener-log/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.269783 4748 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-556bf684bc-f9q9w_6c942b79-bc14-4a48-8fbd-32667bc1afc6/barbican-worker/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.295329 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-556bf684bc-f9q9w_6c942b79-bc14-4a48-8fbd-32667bc1afc6/barbican-worker-log/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.592085 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_06f5e1bb-615b-4f6e-9c62-93a82f0984c8/ceilometer-central-agent/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.646583 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_06f5e1bb-615b-4f6e-9c62-93a82f0984c8/ceilometer-notification-agent/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.700020 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-q4wv5_01e10255-e1d0-4e62-9b54-4c1043b5f502/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.768302 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_06f5e1bb-615b-4f6e-9c62-93a82f0984c8/proxy-httpd/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.791087 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_06f5e1bb-615b-4f6e-9c62-93a82f0984c8/sg-core/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.987024 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4376a841-9631-4c91-bae6-9c12b2f46a17/cinder-api/0.log" Mar 20 11:59:37 crc kubenswrapper[4748]: I0320 11:59:37.993171 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4376a841-9631-4c91-bae6-9c12b2f46a17/cinder-api-log/0.log" Mar 20 11:59:38 crc kubenswrapper[4748]: I0320 
11:59:38.157347 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c/cinder-scheduler/0.log" Mar 20 11:59:38 crc kubenswrapper[4748]: I0320 11:59:38.279599 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9cbc2f9e-ca03-4d54-b229-ab80f6ffb53c/probe/0.log" Mar 20 11:59:38 crc kubenswrapper[4748]: I0320 11:59:38.549326 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-r6hn2_2e4d68e5-3aee-40fe-98fb-a2c06bdd601e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:38 crc kubenswrapper[4748]: I0320 11:59:38.641341 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ckmg2_66fa6d88-f5fa-4288-8b2b-bc30561967c0/init/0.log" Mar 20 11:59:38 crc kubenswrapper[4748]: I0320 11:59:38.921348 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-wdbpr_9e122eba-13d6-4074-b8d8-a4fc7ae4e3f1/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:39 crc kubenswrapper[4748]: I0320 11:59:39.170625 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ckmg2_66fa6d88-f5fa-4288-8b2b-bc30561967c0/init/0.log" Mar 20 11:59:39 crc kubenswrapper[4748]: I0320 11:59:39.486605 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ffe44fec-9121-46b2-9087-eba59b656915/glance-httpd/0.log" Mar 20 11:59:39 crc kubenswrapper[4748]: I0320 11:59:39.488527 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ckmg2_66fa6d88-f5fa-4288-8b2b-bc30561967c0/dnsmasq-dns/0.log" Mar 20 11:59:39 crc kubenswrapper[4748]: I0320 11:59:39.516180 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rzshm_a0708128-eacf-422a-8dac-98032a9f12e7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:39 crc kubenswrapper[4748]: I0320 11:59:39.613612 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ffe44fec-9121-46b2-9087-eba59b656915/glance-log/0.log" Mar 20 11:59:39 crc kubenswrapper[4748]: I0320 11:59:39.696978 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f70c4bdb-308f-485c-9f2b-388e135bdfc9/glance-log/0.log" Mar 20 11:59:39 crc kubenswrapper[4748]: I0320 11:59:39.724120 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f70c4bdb-308f-485c-9f2b-388e135bdfc9/glance-httpd/0.log" Mar 20 11:59:39 crc kubenswrapper[4748]: I0320 11:59:39.953956 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d79b6bb86-nhfts_f3de236a-e527-4582-8eb5-03ca8aa883e0/horizon/0.log" Mar 20 11:59:40 crc kubenswrapper[4748]: I0320 11:59:40.764440 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d79b6bb86-nhfts_f3de236a-e527-4582-8eb5-03ca8aa883e0/horizon-log/0.log" Mar 20 11:59:40 crc kubenswrapper[4748]: I0320 11:59:40.853103 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-d522t_5d2decbf-7d56-4ff7-896e-eaca78da7448/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:41 crc kubenswrapper[4748]: I0320 11:59:41.320077 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29566741-sxfvz_7997caf5-1478-40d5-a0c6-6811d242ef17/keystone-cron/0.log" Mar 20 11:59:41 crc kubenswrapper[4748]: I0320 11:59:41.401653 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_e9d06c7d-5d90-45f8-b4df-b53bff4761a5/kube-state-metrics/0.log" Mar 20 11:59:41 crc kubenswrapper[4748]: I0320 11:59:41.746766 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-n6f7v_93a46290-fef3-4e7a-9cb3-682c3f453cc1/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:41 crc kubenswrapper[4748]: I0320 11:59:41.987037 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-688f5b7cfd-ffmqn_cc43e627-4d33-422e-bfc0-63cb746991ca/keystone-api/0.log" Mar 20 11:59:42 crc kubenswrapper[4748]: I0320 11:59:42.514869 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695f6cc9c-5bkz4_0900bb20-c211-44be-a5f8-6775641e54ca/neutron-httpd/0.log" Mar 20 11:59:42 crc kubenswrapper[4748]: I0320 11:59:42.657989 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695f6cc9c-5bkz4_0900bb20-c211-44be-a5f8-6775641e54ca/neutron-api/0.log" Mar 20 11:59:42 crc kubenswrapper[4748]: I0320 11:59:42.743976 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zr6v8_cba4401d-824d-4c51-8a04-43691fa34a45/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:43 crc kubenswrapper[4748]: I0320 11:59:43.566559 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8b67252d-9978-4214-bf76-b57b2272c603/nova-cell0-conductor-conductor/0.log" Mar 20 11:59:44 crc kubenswrapper[4748]: I0320 11:59:44.312861 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rf2m4_a1734ea2-369d-4c96-aca3-1a450a82e9dc/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:44 crc kubenswrapper[4748]: I0320 11:59:44.344960 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_6acc81f7-4f7d-4828-a328-1e2a4426bd57/nova-api-log/0.log" Mar 20 11:59:44 crc kubenswrapper[4748]: I0320 11:59:44.424298 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a9acc08e-1cf9-4a43-9b60-2bd4e1cad401/nova-cell1-conductor-conductor/0.log" Mar 20 11:59:44 crc kubenswrapper[4748]: I0320 11:59:44.749546 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a2376389-554f-4c38-bfc1-00962d858ff4/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 11:59:44 crc kubenswrapper[4748]: I0320 11:59:44.946130 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6acc81f7-4f7d-4828-a328-1e2a4426bd57/nova-api-api/0.log" Mar 20 11:59:45 crc kubenswrapper[4748]: I0320 11:59:45.015243 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2f35a381-dc79-4781-97a4-1d0c8f96a0d2/nova-metadata-log/0.log" Mar 20 11:59:45 crc kubenswrapper[4748]: I0320 11:59:45.690679 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ffd19d53-385f-45a9-a222-caa7fbf6545e/mysql-bootstrap/0.log" Mar 20 11:59:45 crc kubenswrapper[4748]: I0320 11:59:45.836661 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2f35a381-dc79-4781-97a4-1d0c8f96a0d2/nova-metadata-metadata/0.log" Mar 20 11:59:45 crc kubenswrapper[4748]: I0320 11:59:45.853819 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_734b70b0-5549-4b8e-aa70-a9d589c5b457/nova-scheduler-scheduler/0.log" Mar 20 11:59:45 crc kubenswrapper[4748]: I0320 11:59:45.865913 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ffd19d53-385f-45a9-a222-caa7fbf6545e/mysql-bootstrap/0.log" Mar 20 11:59:46 crc kubenswrapper[4748]: I0320 11:59:46.139080 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_74743039-97e4-46cf-8fbf-183c8c11ca20/mysql-bootstrap/0.log" Mar 20 11:59:46 crc kubenswrapper[4748]: I0320 11:59:46.155244 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ffd19d53-385f-45a9-a222-caa7fbf6545e/galera/0.log" Mar 20 11:59:46 crc kubenswrapper[4748]: I0320 11:59:46.398293 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_74743039-97e4-46cf-8fbf-183c8c11ca20/mysql-bootstrap/0.log" Mar 20 11:59:46 crc kubenswrapper[4748]: I0320 11:59:46.413952 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_74743039-97e4-46cf-8fbf-183c8c11ca20/galera/0.log" Mar 20 11:59:46 crc kubenswrapper[4748]: I0320 11:59:46.465673 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-fjmvn_4457d05c-f317-4cf2-97cb-03888616f4af/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:46 crc kubenswrapper[4748]: I0320 11:59:46.603725 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2b2f2b26-6292-47bb-b8ee-971d9b47c85d/openstackclient/0.log" Mar 20 11:59:46 crc kubenswrapper[4748]: I0320 11:59:46.713002 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-bldp9_2482f122-92d5-410c-b4c0-41834cea1711/ovn-controller/0.log" Mar 20 11:59:46 crc kubenswrapper[4748]: I0320 11:59:46.856585 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-zhsd6_6b172e4c-c5b0-4573-b80c-9bc074489627/openstack-network-exporter/0.log" Mar 20 11:59:46 crc kubenswrapper[4748]: I0320 11:59:46.944230 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-756zb_334c1861-88b1-44e2-a02e-ad1dcecf2fc0/ovsdb-server-init/0.log" Mar 20 11:59:47 crc kubenswrapper[4748]: I0320 11:59:47.152746 4748 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-756zb_334c1861-88b1-44e2-a02e-ad1dcecf2fc0/ovsdb-server-init/0.log" Mar 20 11:59:47 crc kubenswrapper[4748]: I0320 11:59:47.162818 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-756zb_334c1861-88b1-44e2-a02e-ad1dcecf2fc0/ovs-vswitchd/0.log" Mar 20 11:59:47 crc kubenswrapper[4748]: I0320 11:59:47.245595 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-756zb_334c1861-88b1-44e2-a02e-ad1dcecf2fc0/ovsdb-server/0.log" Mar 20 11:59:47 crc kubenswrapper[4748]: I0320 11:59:47.454464 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9b80cd60-a2f6-4638-a600-4d866573bbc3/openstack-network-exporter/0.log" Mar 20 11:59:47 crc kubenswrapper[4748]: I0320 11:59:47.687119 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-6qhn9_17ae0e15-a1a8-46a4-a2d0-4d1e92fe87b3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:47 crc kubenswrapper[4748]: I0320 11:59:47.798669 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9b80cd60-a2f6-4638-a600-4d866573bbc3/ovn-northd/0.log" Mar 20 11:59:47 crc kubenswrapper[4748]: I0320 11:59:47.853819 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2/openstack-network-exporter/0.log" Mar 20 11:59:47 crc kubenswrapper[4748]: I0320 11:59:47.909437 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9ed1d8cb-39ee-4f07-98b6-61e1ea8777a2/ovsdbserver-nb/0.log" Mar 20 11:59:48 crc kubenswrapper[4748]: I0320 11:59:48.058247 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78/openstack-network-exporter/0.log" Mar 20 11:59:48 crc kubenswrapper[4748]: I0320 11:59:48.157221 4748 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6e2e0cd6-8912-4f2a-81ed-7cf9a9a0bd78/ovsdbserver-sb/0.log" Mar 20 11:59:48 crc kubenswrapper[4748]: I0320 11:59:48.982005 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_507647d5-8633-4346-a9e0-4af3eb0e3e5f/setup-container/0.log" Mar 20 11:59:49 crc kubenswrapper[4748]: I0320 11:59:49.157547 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56cc45587d-dchtq_9bd6666d-34bf-42aa-bac6-e119898e279d/placement-api/0.log" Mar 20 11:59:49 crc kubenswrapper[4748]: I0320 11:59:49.196246 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_507647d5-8633-4346-a9e0-4af3eb0e3e5f/setup-container/0.log" Mar 20 11:59:49 crc kubenswrapper[4748]: I0320 11:59:49.365216 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56cc45587d-dchtq_9bd6666d-34bf-42aa-bac6-e119898e279d/placement-log/0.log" Mar 20 11:59:49 crc kubenswrapper[4748]: I0320 11:59:49.386183 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_507647d5-8633-4346-a9e0-4af3eb0e3e5f/rabbitmq/0.log" Mar 20 11:59:49 crc kubenswrapper[4748]: I0320 11:59:49.453770 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a/setup-container/0.log" Mar 20 11:59:49 crc kubenswrapper[4748]: I0320 11:59:49.644334 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a/setup-container/0.log" Mar 20 11:59:49 crc kubenswrapper[4748]: I0320 11:59:49.691445 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gx2k6_e51cf464-1d93-4c6c-99f9-418be04dce30/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:49 crc kubenswrapper[4748]: 
I0320 11:59:49.726415 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b6bc49d0-a6dc-4a70-9d54-2cc66e8cf07a/rabbitmq/0.log" Mar 20 11:59:49 crc kubenswrapper[4748]: I0320 11:59:49.888962 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-s6rzn_c8abd498-c75d-47c5-992b-77857b856c30/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:50 crc kubenswrapper[4748]: I0320 11:59:50.027492 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6nsm6_6298ed1e-1de4-489a-ba4c-ca6f3f989909/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:50 crc kubenswrapper[4748]: I0320 11:59:50.239431 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gl7rt_4ae809a6-7d0a-4b85-a623-eda42d60e2d7/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:50 crc kubenswrapper[4748]: I0320 11:59:50.624261 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-66zrt_80a3504f-a9f2-4be3-9f87-e4110fb5fc7b/ssh-known-hosts-edpm-deployment/0.log" Mar 20 11:59:50 crc kubenswrapper[4748]: I0320 11:59:50.792050 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c66c949c-9cv26_12263f55-f4a7-481f-afab-45f51bd4d60d/proxy-httpd/0.log" Mar 20 11:59:50 crc kubenswrapper[4748]: I0320 11:59:50.840963 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c66c949c-9cv26_12263f55-f4a7-481f-afab-45f51bd4d60d/proxy-server/0.log" Mar 20 11:59:50 crc kubenswrapper[4748]: I0320 11:59:50.874624 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mlztp_8bde82f9-d6e7-4ab4-8666-5fbbd5b90f17/swift-ring-rebalance/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.121508 4748 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/account-reaper/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.168036 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/account-auditor/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.279702 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/account-replicator/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.530273 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/container-auditor/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.539866 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/account-server/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.560127 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/container-replicator/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.608784 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/container-server/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.794387 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/object-auditor/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.796321 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/container-updater/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.797370 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/object-expirer/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.870189 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/object-replicator/0.log" Mar 20 11:59:51 crc kubenswrapper[4748]: I0320 11:59:51.988668 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/object-updater/0.log" Mar 20 11:59:52 crc kubenswrapper[4748]: I0320 11:59:52.035084 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/object-server/0.log" Mar 20 11:59:52 crc kubenswrapper[4748]: I0320 11:59:52.093347 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/rsync/0.log" Mar 20 11:59:52 crc kubenswrapper[4748]: I0320 11:59:52.096632 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7753c601-7739-4165-b5f2-a673b0797334/swift-recon-cron/0.log" Mar 20 11:59:52 crc kubenswrapper[4748]: I0320 11:59:52.399022 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_da9b4776-4f59-46e4-9cdf-953b0a7f83bf/tempest-tests-tempest-tests-runner/0.log" Mar 20 11:59:52 crc kubenswrapper[4748]: I0320 11:59:52.633489 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_46161687-c406-4f73-aced-74edbd5e2f81/test-operator-logs-container/0.log" Mar 20 11:59:52 crc kubenswrapper[4748]: I0320 11:59:52.756639 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xr6bf_2da6177a-9350-445c-820e-cf678dfd5500/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:53 crc kubenswrapper[4748]: 
I0320 11:59:53.412136 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-7wd5b_d03d3cf8-b0f5-46bf-9396-e6da7698e6fb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 11:59:53 crc kubenswrapper[4748]: I0320 11:59:53.725012 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_27339553-c013-4538-9a4d-5bbd249c197c/memcached/0.log" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.160214 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99"] Mar 20 12:00:00 crc kubenswrapper[4748]: E0320 12:00:00.161134 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f8b58b-d6fc-48cc-a6a2-69caba458a86" containerName="container-00" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.161149 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f8b58b-d6fc-48cc-a6a2-69caba458a86" containerName="container-00" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.161332 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="66f8b58b-d6fc-48cc-a6a2-69caba458a86" containerName="container-00" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.161963 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.165451 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.165557 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.175904 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566800-pzr77"] Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.177481 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566800-pzr77" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.181767 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.182142 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.182233 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.192658 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566800-pzr77"] Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.208947 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99"] Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.284303 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btzkd\" (UniqueName: 
\"kubernetes.io/projected/80f28a5a-6864-4320-b31e-0ab1fd022d45-kube-api-access-btzkd\") pod \"collect-profiles-29566800-mpb99\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.284372 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80f28a5a-6864-4320-b31e-0ab1fd022d45-secret-volume\") pod \"collect-profiles-29566800-mpb99\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.284496 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80f28a5a-6864-4320-b31e-0ab1fd022d45-config-volume\") pod \"collect-profiles-29566800-mpb99\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.284632 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4lkg\" (UniqueName: \"kubernetes.io/projected/9d5b5af3-b53b-44d3-a690-cbc64f7c61fa-kube-api-access-q4lkg\") pod \"auto-csr-approver-29566800-pzr77\" (UID: \"9d5b5af3-b53b-44d3-a690-cbc64f7c61fa\") " pod="openshift-infra/auto-csr-approver-29566800-pzr77" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.388307 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80f28a5a-6864-4320-b31e-0ab1fd022d45-secret-volume\") pod \"collect-profiles-29566800-mpb99\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 
crc kubenswrapper[4748]: I0320 12:00:00.388489 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80f28a5a-6864-4320-b31e-0ab1fd022d45-config-volume\") pod \"collect-profiles-29566800-mpb99\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.388708 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4lkg\" (UniqueName: \"kubernetes.io/projected/9d5b5af3-b53b-44d3-a690-cbc64f7c61fa-kube-api-access-q4lkg\") pod \"auto-csr-approver-29566800-pzr77\" (UID: \"9d5b5af3-b53b-44d3-a690-cbc64f7c61fa\") " pod="openshift-infra/auto-csr-approver-29566800-pzr77" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.388887 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btzkd\" (UniqueName: \"kubernetes.io/projected/80f28a5a-6864-4320-b31e-0ab1fd022d45-kube-api-access-btzkd\") pod \"collect-profiles-29566800-mpb99\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.389701 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80f28a5a-6864-4320-b31e-0ab1fd022d45-config-volume\") pod \"collect-profiles-29566800-mpb99\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.471824 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80f28a5a-6864-4320-b31e-0ab1fd022d45-secret-volume\") pod \"collect-profiles-29566800-mpb99\" (UID: 
\"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.471928 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btzkd\" (UniqueName: \"kubernetes.io/projected/80f28a5a-6864-4320-b31e-0ab1fd022d45-kube-api-access-btzkd\") pod \"collect-profiles-29566800-mpb99\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.476463 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4lkg\" (UniqueName: \"kubernetes.io/projected/9d5b5af3-b53b-44d3-a690-cbc64f7c61fa-kube-api-access-q4lkg\") pod \"auto-csr-approver-29566800-pzr77\" (UID: \"9d5b5af3-b53b-44d3-a690-cbc64f7c61fa\") " pod="openshift-infra/auto-csr-approver-29566800-pzr77" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.489893 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:00 crc kubenswrapper[4748]: I0320 12:00:00.503509 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566800-pzr77" Mar 20 12:00:01 crc kubenswrapper[4748]: I0320 12:00:01.021820 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566800-pzr77"] Mar 20 12:00:01 crc kubenswrapper[4748]: I0320 12:00:01.126057 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99"] Mar 20 12:00:01 crc kubenswrapper[4748]: W0320 12:00:01.130585 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80f28a5a_6864_4320_b31e_0ab1fd022d45.slice/crio-abb21b4878a66c48981b3c6139e85e83e33760df483e1ad66f9519bf84d2338f WatchSource:0}: Error finding container abb21b4878a66c48981b3c6139e85e83e33760df483e1ad66f9519bf84d2338f: Status 404 returned error can't find the container with id abb21b4878a66c48981b3c6139e85e83e33760df483e1ad66f9519bf84d2338f Mar 20 12:00:01 crc kubenswrapper[4748]: I0320 12:00:01.215629 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" event={"ID":"80f28a5a-6864-4320-b31e-0ab1fd022d45","Type":"ContainerStarted","Data":"abb21b4878a66c48981b3c6139e85e83e33760df483e1ad66f9519bf84d2338f"} Mar 20 12:00:01 crc kubenswrapper[4748]: I0320 12:00:01.224947 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566800-pzr77" event={"ID":"9d5b5af3-b53b-44d3-a690-cbc64f7c61fa","Type":"ContainerStarted","Data":"9beeace1e59ef222c4b516e2c1ab2116fd04ed845ff371b38da755573f9788c5"} Mar 20 12:00:02 crc kubenswrapper[4748]: I0320 12:00:02.234552 4748 generic.go:334] "Generic (PLEG): container finished" podID="80f28a5a-6864-4320-b31e-0ab1fd022d45" containerID="08eb144495dbd3cea6e15fde359b604e436960e4a08f9a609c5fc57dc4d4c201" exitCode=0 Mar 20 12:00:02 crc kubenswrapper[4748]: I0320 12:00:02.234752 4748 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" event={"ID":"80f28a5a-6864-4320-b31e-0ab1fd022d45","Type":"ContainerDied","Data":"08eb144495dbd3cea6e15fde359b604e436960e4a08f9a609c5fc57dc4d4c201"} Mar 20 12:00:03 crc kubenswrapper[4748]: I0320 12:00:03.589609 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:03 crc kubenswrapper[4748]: I0320 12:00:03.759531 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80f28a5a-6864-4320-b31e-0ab1fd022d45-config-volume\") pod \"80f28a5a-6864-4320-b31e-0ab1fd022d45\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " Mar 20 12:00:03 crc kubenswrapper[4748]: I0320 12:00:03.759767 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80f28a5a-6864-4320-b31e-0ab1fd022d45-secret-volume\") pod \"80f28a5a-6864-4320-b31e-0ab1fd022d45\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " Mar 20 12:00:03 crc kubenswrapper[4748]: I0320 12:00:03.759809 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btzkd\" (UniqueName: \"kubernetes.io/projected/80f28a5a-6864-4320-b31e-0ab1fd022d45-kube-api-access-btzkd\") pod \"80f28a5a-6864-4320-b31e-0ab1fd022d45\" (UID: \"80f28a5a-6864-4320-b31e-0ab1fd022d45\") " Mar 20 12:00:03 crc kubenswrapper[4748]: I0320 12:00:03.760136 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f28a5a-6864-4320-b31e-0ab1fd022d45-config-volume" (OuterVolumeSpecName: "config-volume") pod "80f28a5a-6864-4320-b31e-0ab1fd022d45" (UID: "80f28a5a-6864-4320-b31e-0ab1fd022d45"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:00:03 crc kubenswrapper[4748]: I0320 12:00:03.760821 4748 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80f28a5a-6864-4320-b31e-0ab1fd022d45-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 12:00:03 crc kubenswrapper[4748]: I0320 12:00:03.771014 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f28a5a-6864-4320-b31e-0ab1fd022d45-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "80f28a5a-6864-4320-b31e-0ab1fd022d45" (UID: "80f28a5a-6864-4320-b31e-0ab1fd022d45"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:00:03 crc kubenswrapper[4748]: I0320 12:00:03.771064 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f28a5a-6864-4320-b31e-0ab1fd022d45-kube-api-access-btzkd" (OuterVolumeSpecName: "kube-api-access-btzkd") pod "80f28a5a-6864-4320-b31e-0ab1fd022d45" (UID: "80f28a5a-6864-4320-b31e-0ab1fd022d45"). InnerVolumeSpecName "kube-api-access-btzkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:00:03 crc kubenswrapper[4748]: I0320 12:00:03.862809 4748 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80f28a5a-6864-4320-b31e-0ab1fd022d45-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 12:00:03 crc kubenswrapper[4748]: I0320 12:00:03.862862 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btzkd\" (UniqueName: \"kubernetes.io/projected/80f28a5a-6864-4320-b31e-0ab1fd022d45-kube-api-access-btzkd\") on node \"crc\" DevicePath \"\"" Mar 20 12:00:04 crc kubenswrapper[4748]: I0320 12:00:04.254223 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" event={"ID":"80f28a5a-6864-4320-b31e-0ab1fd022d45","Type":"ContainerDied","Data":"abb21b4878a66c48981b3c6139e85e83e33760df483e1ad66f9519bf84d2338f"} Mar 20 12:00:04 crc kubenswrapper[4748]: I0320 12:00:04.254267 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abb21b4878a66c48981b3c6139e85e83e33760df483e1ad66f9519bf84d2338f" Mar 20 12:00:04 crc kubenswrapper[4748]: I0320 12:00:04.254334 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566800-mpb99" Mar 20 12:00:04 crc kubenswrapper[4748]: I0320 12:00:04.666583 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"] Mar 20 12:00:04 crc kubenswrapper[4748]: I0320 12:00:04.680175 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-tbdmw"] Mar 20 12:00:05 crc kubenswrapper[4748]: I0320 12:00:05.557228 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db15c3e8-540b-4a26-b05d-b56ed957a8bc" path="/var/lib/kubelet/pods/db15c3e8-540b-4a26-b05d-b56ed957a8bc/volumes" Mar 20 12:00:19 crc kubenswrapper[4748]: I0320 12:00:19.264476 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/util/0.log" Mar 20 12:00:19 crc kubenswrapper[4748]: I0320 12:00:19.957176 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/util/0.log" Mar 20 12:00:19 crc kubenswrapper[4748]: I0320 12:00:19.994691 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/pull/0.log" Mar 20 12:00:19 crc kubenswrapper[4748]: I0320 12:00:19.998412 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/pull/0.log" Mar 20 12:00:20 crc kubenswrapper[4748]: I0320 12:00:20.176118 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/pull/0.log" Mar 20 12:00:20 crc kubenswrapper[4748]: I0320 12:00:20.182274 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/util/0.log" Mar 20 12:00:20 crc kubenswrapper[4748]: I0320 12:00:20.299151 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69f52c17ab4278ef34a91246e6ae1710f6a470e03e4488535e84bfb4ebh7v5w_c9a1c042-3dd9-486b-aafd-05767bd2e20b/extract/0.log" Mar 20 12:00:20 crc kubenswrapper[4748]: I0320 12:00:20.384578 4748 generic.go:334] "Generic (PLEG): container finished" podID="9d5b5af3-b53b-44d3-a690-cbc64f7c61fa" containerID="f0bac34c16833392314be99b4f108588d770550934c80b4b410212d3b0e6458e" exitCode=0 Mar 20 12:00:20 crc kubenswrapper[4748]: I0320 12:00:20.384616 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566800-pzr77" event={"ID":"9d5b5af3-b53b-44d3-a690-cbc64f7c61fa","Type":"ContainerDied","Data":"f0bac34c16833392314be99b4f108588d770550934c80b4b410212d3b0e6458e"} Mar 20 12:00:20 crc kubenswrapper[4748]: I0320 12:00:20.458021 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-4gsq2_20023868-c089-41ec-ac26-9b4882fbab50/manager/0.log" Mar 20 12:00:20 crc kubenswrapper[4748]: I0320 12:00:20.717033 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-9gzr8_d855d6bf-853d-454b-b0b7-feb11f23cc17/manager/0.log" Mar 20 12:00:20 crc kubenswrapper[4748]: I0320 12:00:20.895043 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-dn24h_8582a4fb-51b2-411c-a67f-31a023f40493/manager/0.log" Mar 20 12:00:21 crc kubenswrapper[4748]: I0320 12:00:21.014137 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-r4sct_3fb5bb3a-ab86-4c3f-9d2d-9ef6d7f7ca1f/manager/0.log" Mar 20 12:00:21 crc kubenswrapper[4748]: I0320 12:00:21.252796 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-9gkdc_5d9f2386-33fc-43e9-9a61-e0d57fd94fbe/manager/0.log" Mar 20 12:00:21 crc kubenswrapper[4748]: I0320 12:00:21.539133 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-9zhhz_640d4c26-acbd-4cb4-8b59-fde206294a91/manager/0.log" Mar 20 12:00:21 crc kubenswrapper[4748]: I0320 12:00:21.779430 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566800-pzr77" Mar 20 12:00:21 crc kubenswrapper[4748]: I0320 12:00:21.863442 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-x7sjb_c29e0600-cf39-40bf-9225-48e55c4b8f97/manager/0.log" Mar 20 12:00:21 crc kubenswrapper[4748]: I0320 12:00:21.867077 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-prgc2_17f5527d-b31e-4788-ab09-ac5d26ea1bce/manager/0.log" Mar 20 12:00:21 crc kubenswrapper[4748]: I0320 12:00:21.937620 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4lkg\" (UniqueName: \"kubernetes.io/projected/9d5b5af3-b53b-44d3-a690-cbc64f7c61fa-kube-api-access-q4lkg\") pod \"9d5b5af3-b53b-44d3-a690-cbc64f7c61fa\" (UID: \"9d5b5af3-b53b-44d3-a690-cbc64f7c61fa\") " Mar 20 12:00:21 crc kubenswrapper[4748]: I0320 12:00:21.945332 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5b5af3-b53b-44d3-a690-cbc64f7c61fa-kube-api-access-q4lkg" (OuterVolumeSpecName: "kube-api-access-q4lkg") pod "9d5b5af3-b53b-44d3-a690-cbc64f7c61fa" (UID: "9d5b5af3-b53b-44d3-a690-cbc64f7c61fa"). InnerVolumeSpecName "kube-api-access-q4lkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.039808 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4lkg\" (UniqueName: \"kubernetes.io/projected/9d5b5af3-b53b-44d3-a690-cbc64f7c61fa-kube-api-access-q4lkg\") on node \"crc\" DevicePath \"\"" Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.139307 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-stsk5_1ec4d02c-2709-4102-8a27-c4e7c71ed61f/manager/0.log" Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.180606 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-blfgz_6dbf38ab-35e4-4cf3-9655-b8dc49eaea7d/manager/0.log" Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.300912 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-r5z4d_b0d0b327-5826-4c41-84bc-8b2c2bb05756/manager/0.log" Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.403948 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566800-pzr77" event={"ID":"9d5b5af3-b53b-44d3-a690-cbc64f7c61fa","Type":"ContainerDied","Data":"9beeace1e59ef222c4b516e2c1ab2116fd04ed845ff371b38da755573f9788c5"} Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.404325 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9beeace1e59ef222c4b516e2c1ab2116fd04ed845ff371b38da755573f9788c5" Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.404398 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566800-pzr77" Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.421039 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-zcqnn_ecd87b49-65fa-465e-a668-03cb90381b6e/manager/0.log" Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.561914 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-m84q7_bd4cdccf-68e3-4c27-ae51-f54b8089e08b/manager/0.log" Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.844878 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566794-86k58"] Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.857935 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566794-86k58"] Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.917161 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-cgzmd_49092c30-9830-451a-8003-2cc7fa078b62/manager/0.log" Mar 20 12:00:22 crc kubenswrapper[4748]: I0320 12:00:22.976328 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-rxplv_f0a3f8d9-dcfa-498a-a46e-61628aa68067/manager/0.log" Mar 20 12:00:23 crc kubenswrapper[4748]: I0320 12:00:23.128556 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-d8d579484-vx6pz_35554cb6-28ee-4104-8591-ee987f93805b/operator/0.log" Mar 20 12:00:23 crc kubenswrapper[4748]: I0320 12:00:23.387908 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-p9wd8_246b06bc-5f0b-4ef1-87eb-a0f56ad26e30/registry-server/0.log" Mar 20 12:00:23 crc kubenswrapper[4748]: I0320 12:00:23.444215 
4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-rt4hn_bf9a7295-e355-4c61-a841-fd2bce675235/manager/0.log" Mar 20 12:00:23 crc kubenswrapper[4748]: I0320 12:00:23.529558 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6ad2b5-b240-441e-bfd8-8337f2b7c53a" path="/var/lib/kubelet/pods/8d6ad2b5-b240-441e-bfd8-8337f2b7c53a/volumes" Mar 20 12:00:23 crc kubenswrapper[4748]: I0320 12:00:23.628485 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-897gg_736beaed-774c-43c0-bff9-d66a5ae4a1f5/manager/0.log" Mar 20 12:00:23 crc kubenswrapper[4748]: I0320 12:00:23.662978 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7tgsh_b0646c53-71d5-40d9-8a3b-77c244fff7c4/operator/0.log" Mar 20 12:00:23 crc kubenswrapper[4748]: I0320 12:00:23.909941 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-2lkhq_1988c5e6-a91c-4085-a878-2ffdf478fa1b/manager/0.log" Mar 20 12:00:24 crc kubenswrapper[4748]: I0320 12:00:24.016461 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-9pmr7_44b5dd5f-9a81-4c01-8efd-6d4997bb9c94/manager/0.log" Mar 20 12:00:24 crc kubenswrapper[4748]: I0320 12:00:24.120455 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-vsd7j_8e47e918-33de-4a66-9223-7ee3264600c1/manager/0.log" Mar 20 12:00:24 crc kubenswrapper[4748]: I0320 12:00:24.202157 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-h7vwp_b6d05c98-f000-4560-b790-da31157488dc/manager/0.log" Mar 20 12:00:24 crc kubenswrapper[4748]: I0320 
12:00:24.312771 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5647f98656-9pqdv_795ce1d0-2232-4ed7-8618-c47a7584973e/manager/0.log" Mar 20 12:00:25 crc kubenswrapper[4748]: I0320 12:00:25.185708 4748 scope.go:117] "RemoveContainer" containerID="a0fa16c072acc5eeb5fd94f12182bb8326fc395167269f32546de0c675b4b536" Mar 20 12:00:25 crc kubenswrapper[4748]: I0320 12:00:25.232083 4748 scope.go:117] "RemoveContainer" containerID="554d8f30dc4e32b964cfaa1ec07885aa0297794b94f6d70063855076cb48ba28" Mar 20 12:00:46 crc kubenswrapper[4748]: I0320 12:00:46.121271 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-v79zw_a6110c56-5634-4ef9-92b1-4c7c75dd4986/control-plane-machine-set-operator/0.log" Mar 20 12:00:46 crc kubenswrapper[4748]: I0320 12:00:46.335861 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9lbsk_2cc53e20-383b-4e3a-a00a-d54ac8272e00/kube-rbac-proxy/0.log" Mar 20 12:00:46 crc kubenswrapper[4748]: I0320 12:00:46.358572 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9lbsk_2cc53e20-383b-4e3a-a00a-d54ac8272e00/machine-api-operator/0.log" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.158916 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29566801-wsbb2"] Mar 20 12:01:00 crc kubenswrapper[4748]: E0320 12:01:00.160007 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d5b5af3-b53b-44d3-a690-cbc64f7c61fa" containerName="oc" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.160027 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5b5af3-b53b-44d3-a690-cbc64f7c61fa" containerName="oc" Mar 20 12:01:00 crc kubenswrapper[4748]: E0320 12:01:00.160074 4748 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="80f28a5a-6864-4320-b31e-0ab1fd022d45" containerName="collect-profiles" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.160082 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f28a5a-6864-4320-b31e-0ab1fd022d45" containerName="collect-profiles" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.160366 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d5b5af3-b53b-44d3-a690-cbc64f7c61fa" containerName="oc" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.160396 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f28a5a-6864-4320-b31e-0ab1fd022d45" containerName="collect-profiles" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.161228 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.175884 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566801-wsbb2"] Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.315620 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-fernet-keys\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.315827 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-combined-ca-bundle\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.315969 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-config-data\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.316113 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc6vc\" (UniqueName: \"kubernetes.io/projected/a8af8b30-2085-488a-a4e1-97dd184fc28a-kube-api-access-hc6vc\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.418288 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-combined-ca-bundle\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.418360 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-config-data\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.418471 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc6vc\" (UniqueName: \"kubernetes.io/projected/a8af8b30-2085-488a-a4e1-97dd184fc28a-kube-api-access-hc6vc\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.418559 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-fernet-keys\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.426691 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-combined-ca-bundle\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.426926 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-config-data\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.429613 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-fernet-keys\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.437331 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc6vc\" (UniqueName: \"kubernetes.io/projected/a8af8b30-2085-488a-a4e1-97dd184fc28a-kube-api-access-hc6vc\") pod \"keystone-cron-29566801-wsbb2\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.491403 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.618278 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-krcxd_668082d7-988d-415d-bfde-1c28171130b5/cert-manager-controller/0.log" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.857625 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-jt8rf_2f029dc3-bb1a-4b18-91ee-fd467cbe157f/cert-manager-webhook/0.log" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.919399 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-wn2nc_bd82c1fc-6aff-4336-80fd-247fbc7aed58/cert-manager-cainjector/0.log" Mar 20 12:01:00 crc kubenswrapper[4748]: I0320 12:01:00.977209 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566801-wsbb2"] Mar 20 12:01:01 crc kubenswrapper[4748]: I0320 12:01:01.743814 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566801-wsbb2" event={"ID":"a8af8b30-2085-488a-a4e1-97dd184fc28a","Type":"ContainerStarted","Data":"12a98b25977145ba746067140b7e0d2aafb47d2779e8e088a0ad994cf3803412"} Mar 20 12:01:01 crc kubenswrapper[4748]: I0320 12:01:01.744169 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566801-wsbb2" event={"ID":"a8af8b30-2085-488a-a4e1-97dd184fc28a","Type":"ContainerStarted","Data":"7e5972b26c7fee0c7cd0f2dd9bb7eac7cc6493632a63eab09a6eeca5f69e792e"} Mar 20 12:01:01 crc kubenswrapper[4748]: I0320 12:01:01.763537 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29566801-wsbb2" podStartSLOduration=1.763511829 podStartE2EDuration="1.763511829s" podCreationTimestamp="2026-03-20 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 12:01:01.758486912 +0000 UTC m=+5096.900032726" watchObservedRunningTime="2026-03-20 12:01:01.763511829 +0000 UTC m=+5096.905057643" Mar 20 12:01:04 crc kubenswrapper[4748]: I0320 12:01:04.778696 4748 generic.go:334] "Generic (PLEG): container finished" podID="a8af8b30-2085-488a-a4e1-97dd184fc28a" containerID="12a98b25977145ba746067140b7e0d2aafb47d2779e8e088a0ad994cf3803412" exitCode=0 Mar 20 12:01:04 crc kubenswrapper[4748]: I0320 12:01:04.778803 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566801-wsbb2" event={"ID":"a8af8b30-2085-488a-a4e1-97dd184fc28a","Type":"ContainerDied","Data":"12a98b25977145ba746067140b7e0d2aafb47d2779e8e088a0ad994cf3803412"} Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.151694 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.165903 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc6vc\" (UniqueName: \"kubernetes.io/projected/a8af8b30-2085-488a-a4e1-97dd184fc28a-kube-api-access-hc6vc\") pod \"a8af8b30-2085-488a-a4e1-97dd184fc28a\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.165946 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-config-data\") pod \"a8af8b30-2085-488a-a4e1-97dd184fc28a\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.166103 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-combined-ca-bundle\") pod \"a8af8b30-2085-488a-a4e1-97dd184fc28a\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " Mar 
20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.166215 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-fernet-keys\") pod \"a8af8b30-2085-488a-a4e1-97dd184fc28a\" (UID: \"a8af8b30-2085-488a-a4e1-97dd184fc28a\") " Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.172601 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a8af8b30-2085-488a-a4e1-97dd184fc28a" (UID: "a8af8b30-2085-488a-a4e1-97dd184fc28a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.216393 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8af8b30-2085-488a-a4e1-97dd184fc28a-kube-api-access-hc6vc" (OuterVolumeSpecName: "kube-api-access-hc6vc") pod "a8af8b30-2085-488a-a4e1-97dd184fc28a" (UID: "a8af8b30-2085-488a-a4e1-97dd184fc28a"). InnerVolumeSpecName "kube-api-access-hc6vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.220968 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8af8b30-2085-488a-a4e1-97dd184fc28a" (UID: "a8af8b30-2085-488a-a4e1-97dd184fc28a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.252896 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-config-data" (OuterVolumeSpecName: "config-data") pod "a8af8b30-2085-488a-a4e1-97dd184fc28a" (UID: "a8af8b30-2085-488a-a4e1-97dd184fc28a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.267679 4748 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.267712 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc6vc\" (UniqueName: \"kubernetes.io/projected/a8af8b30-2085-488a-a4e1-97dd184fc28a-kube-api-access-hc6vc\") on node \"crc\" DevicePath \"\"" Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.267723 4748 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.267731 4748 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af8b30-2085-488a-a4e1-97dd184fc28a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.798909 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566801-wsbb2" event={"ID":"a8af8b30-2085-488a-a4e1-97dd184fc28a","Type":"ContainerDied","Data":"7e5972b26c7fee0c7cd0f2dd9bb7eac7cc6493632a63eab09a6eeca5f69e792e"} Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.798952 4748 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="7e5972b26c7fee0c7cd0f2dd9bb7eac7cc6493632a63eab09a6eeca5f69e792e" Mar 20 12:01:06 crc kubenswrapper[4748]: I0320 12:01:06.798968 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566801-wsbb2" Mar 20 12:01:12 crc kubenswrapper[4748]: I0320 12:01:12.928257 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 12:01:12 crc kubenswrapper[4748]: I0320 12:01:12.928806 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 12:01:13 crc kubenswrapper[4748]: I0320 12:01:13.568069 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-l8vtz_2781b78b-43e7-4826-8e44-74f302a93478/nmstate-console-plugin/0.log" Mar 20 12:01:13 crc kubenswrapper[4748]: I0320 12:01:13.778354 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sms75_12dadf04-5eff-4e48-96cd-d8033b0baf63/nmstate-handler/0.log" Mar 20 12:01:13 crc kubenswrapper[4748]: I0320 12:01:13.865625 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-n2kjn_48cc57fd-25e5-490f-af0b-13a1e5f9be6d/kube-rbac-proxy/0.log" Mar 20 12:01:13 crc kubenswrapper[4748]: I0320 12:01:13.918371 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-n2kjn_48cc57fd-25e5-490f-af0b-13a1e5f9be6d/nmstate-metrics/0.log" Mar 20 12:01:13 crc 
kubenswrapper[4748]: I0320 12:01:13.966741 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-hfdql_6ec3f263-5ad8-494d-88d0-ec9f60f7c6bc/nmstate-operator/0.log" Mar 20 12:01:14 crc kubenswrapper[4748]: I0320 12:01:14.100644 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-w284q_fd6c2ee9-cef6-406b-a6e0-e1f741be9f61/nmstate-webhook/0.log" Mar 20 12:01:42 crc kubenswrapper[4748]: I0320 12:01:42.242390 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-ppl7d_1dcaf132-7ad3-4b86-ba2f-e695238b2001/kube-rbac-proxy/0.log" Mar 20 12:01:42 crc kubenswrapper[4748]: I0320 12:01:42.373882 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-ppl7d_1dcaf132-7ad3-4b86-ba2f-e695238b2001/controller/0.log" Mar 20 12:01:42 crc kubenswrapper[4748]: I0320 12:01:42.479191 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-8rblj_de736bb5-e7a6-4a9a-8841-5ff65871db92/frr-k8s-webhook-server/0.log" Mar 20 12:01:42 crc kubenswrapper[4748]: I0320 12:01:42.551116 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-frr-files/0.log" Mar 20 12:01:42 crc kubenswrapper[4748]: I0320 12:01:42.771918 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-frr-files/0.log" Mar 20 12:01:42 crc kubenswrapper[4748]: I0320 12:01:42.817254 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-reloader/0.log" Mar 20 12:01:42 crc kubenswrapper[4748]: I0320 12:01:42.889227 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-reloader/0.log" Mar 20 12:01:42 crc kubenswrapper[4748]: I0320 12:01:42.893957 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-metrics/0.log" Mar 20 12:01:42 crc kubenswrapper[4748]: I0320 12:01:42.927943 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 12:01:42 crc kubenswrapper[4748]: I0320 12:01:42.928010 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.045534 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-reloader/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.069733 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-frr-files/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.076775 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-metrics/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.092134 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-metrics/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.372987 4748 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-frr-files/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.379031 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-reloader/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.383561 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/controller/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.399369 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/cp-metrics/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.564379 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/kube-rbac-proxy/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.581062 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/frr-metrics/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.660025 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/kube-rbac-proxy-frr/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.802026 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/reloader/0.log" Mar 20 12:01:43 crc kubenswrapper[4748]: I0320 12:01:43.948086 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b86d95c7b-kqtkn_7b6c6eee-f00c-458f-b050-6aaab992addf/manager/0.log" Mar 20 12:01:44 crc kubenswrapper[4748]: I0320 12:01:44.108868 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c7659c7ff-w4c8m_be8702a2-29a5-4037-94f4-0f3a4b48754d/webhook-server/0.log" Mar 20 12:01:44 crc kubenswrapper[4748]: I0320 12:01:44.324288 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vkqf7_7a239ede-0107-4751-b103-27b225f2cf5e/kube-rbac-proxy/0.log" Mar 20 12:01:44 crc kubenswrapper[4748]: I0320 12:01:44.807136 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vkqf7_7a239ede-0107-4751-b103-27b225f2cf5e/speaker/0.log" Mar 20 12:01:45 crc kubenswrapper[4748]: I0320 12:01:45.517999 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wf8gw_9c9f62ea-e7bf-4b0b-adc7-d0f282d6d13c/frr/0.log" Mar 20 12:01:58 crc kubenswrapper[4748]: I0320 12:01:58.300175 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/util/0.log" Mar 20 12:01:58 crc kubenswrapper[4748]: I0320 12:01:58.468190 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/pull/0.log" Mar 20 12:01:58 crc kubenswrapper[4748]: I0320 12:01:58.470344 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/util/0.log" Mar 20 12:01:58 crc kubenswrapper[4748]: I0320 12:01:58.518945 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/pull/0.log" Mar 20 12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.187690 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/util/0.log" Mar 20 12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.188881 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/pull/0.log" Mar 20 12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.200436 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tsfzz_5e2ce8ab-5247-412b-a4fb-d35645c906c6/extract/0.log" Mar 20 12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.355299 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/util/0.log" Mar 20 12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.542237 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/pull/0.log" Mar 20 12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.544258 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/util/0.log" Mar 20 12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.560754 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/pull/0.log" Mar 20 12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.747736 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/util/0.log" Mar 20 
12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.755249 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/pull/0.log" Mar 20 12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.776862 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1426jk_22772fc8-2959-4dfa-b1aa-070f9db955a1/extract/0.log" Mar 20 12:01:59 crc kubenswrapper[4748]: I0320 12:01:59.903564 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-utilities/0.log" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.087898 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-content/0.log" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.087953 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-utilities/0.log" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.124729 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-content/0.log" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.148640 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566802-hvkxv"] Mar 20 12:02:00 crc kubenswrapper[4748]: E0320 12:02:00.149130 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8af8b30-2085-488a-a4e1-97dd184fc28a" containerName="keystone-cron" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.149155 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8af8b30-2085-488a-a4e1-97dd184fc28a" 
containerName="keystone-cron" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.149410 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8af8b30-2085-488a-a4e1-97dd184fc28a" containerName="keystone-cron" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.150150 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566802-hvkxv" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.153015 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.153524 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.154364 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.159721 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566802-hvkxv"] Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.250240 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5fvc\" (UniqueName: \"kubernetes.io/projected/8bf6c021-b677-4fc2-8fa4-a4c86022bea9-kube-api-access-v5fvc\") pod \"auto-csr-approver-29566802-hvkxv\" (UID: \"8bf6c021-b677-4fc2-8fa4-a4c86022bea9\") " pod="openshift-infra/auto-csr-approver-29566802-hvkxv" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.351471 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5fvc\" (UniqueName: \"kubernetes.io/projected/8bf6c021-b677-4fc2-8fa4-a4c86022bea9-kube-api-access-v5fvc\") pod \"auto-csr-approver-29566802-hvkxv\" (UID: \"8bf6c021-b677-4fc2-8fa4-a4c86022bea9\") " pod="openshift-infra/auto-csr-approver-29566802-hvkxv" Mar 20 12:02:00 
crc kubenswrapper[4748]: I0320 12:02:00.583960 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5fvc\" (UniqueName: \"kubernetes.io/projected/8bf6c021-b677-4fc2-8fa4-a4c86022bea9-kube-api-access-v5fvc\") pod \"auto-csr-approver-29566802-hvkxv\" (UID: \"8bf6c021-b677-4fc2-8fa4-a4c86022bea9\") " pod="openshift-infra/auto-csr-approver-29566802-hvkxv" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.759660 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-utilities/0.log" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.766379 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/extract-content/0.log" Mar 20 12:02:00 crc kubenswrapper[4748]: I0320 12:02:00.822339 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566802-hvkxv" Mar 20 12:02:01 crc kubenswrapper[4748]: I0320 12:02:01.038122 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-utilities/0.log" Mar 20 12:02:01 crc kubenswrapper[4748]: I0320 12:02:01.243182 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-utilities/0.log" Mar 20 12:02:01 crc kubenswrapper[4748]: I0320 12:02:01.282819 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-content/0.log" Mar 20 12:02:01 crc kubenswrapper[4748]: I0320 12:02:01.285772 4748 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-content/0.log" Mar 20 12:02:01 crc kubenswrapper[4748]: I0320 12:02:01.393380 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566802-hvkxv"] Mar 20 12:02:01 crc kubenswrapper[4748]: I0320 12:02:01.497133 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-content/0.log" Mar 20 12:02:01 crc kubenswrapper[4748]: I0320 12:02:01.534109 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/extract-utilities/0.log" Mar 20 12:02:01 crc kubenswrapper[4748]: I0320 12:02:01.709074 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-52b5d_ce0ed389-9ca5-4022-bc2d-3dfed380bd01/registry-server/0.log" Mar 20 12:02:01 crc kubenswrapper[4748]: I0320 12:02:01.747718 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2qp4n_4e32ab73-56fa-4a44-bb26-42d87e8ee2d5/marketplace-operator/0.log" Mar 20 12:02:01 crc kubenswrapper[4748]: I0320 12:02:01.964440 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-utilities/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.160850 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lnbmv_e152aa56-2971-4596-84ee-5ba1c22ef8e3/registry-server/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.225406 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-content/0.log" Mar 20 12:02:02 crc 
kubenswrapper[4748]: I0320 12:02:02.234060 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-utilities/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.243491 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-content/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.278360 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566802-hvkxv" event={"ID":"8bf6c021-b677-4fc2-8fa4-a4c86022bea9","Type":"ContainerStarted","Data":"da311151a42bc0e69780875eeb075ba50b1b89e50a7eb2f7063be0d0a3807a06"} Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.411560 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-utilities/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.417416 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/extract-content/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.622560 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-utilities/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.646137 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvhwr_ae14d541-76df-4379-98ef-87f4e35e7db3/registry-server/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.785693 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-utilities/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 
12:02:02.822499 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-content/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.831511 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-content/0.log" Mar 20 12:02:02 crc kubenswrapper[4748]: I0320 12:02:02.992719 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-content/0.log" Mar 20 12:02:03 crc kubenswrapper[4748]: I0320 12:02:03.004538 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/extract-utilities/0.log" Mar 20 12:02:03 crc kubenswrapper[4748]: I0320 12:02:03.652229 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8mlkc_36f779a6-7268-4911-8532-f6fda0d56533/registry-server/0.log" Mar 20 12:02:04 crc kubenswrapper[4748]: I0320 12:02:04.305263 4748 generic.go:334] "Generic (PLEG): container finished" podID="8bf6c021-b677-4fc2-8fa4-a4c86022bea9" containerID="7893591d9717acc313f255cd2b4dad137ea10871e51c3328634fc593b9c03d7c" exitCode=0 Mar 20 12:02:04 crc kubenswrapper[4748]: I0320 12:02:04.305324 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566802-hvkxv" event={"ID":"8bf6c021-b677-4fc2-8fa4-a4c86022bea9","Type":"ContainerDied","Data":"7893591d9717acc313f255cd2b4dad137ea10871e51c3328634fc593b9c03d7c"} Mar 20 12:02:05 crc kubenswrapper[4748]: I0320 12:02:05.658570 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566802-hvkxv" Mar 20 12:02:05 crc kubenswrapper[4748]: I0320 12:02:05.664785 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5fvc\" (UniqueName: \"kubernetes.io/projected/8bf6c021-b677-4fc2-8fa4-a4c86022bea9-kube-api-access-v5fvc\") pod \"8bf6c021-b677-4fc2-8fa4-a4c86022bea9\" (UID: \"8bf6c021-b677-4fc2-8fa4-a4c86022bea9\") " Mar 20 12:02:05 crc kubenswrapper[4748]: I0320 12:02:05.672375 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf6c021-b677-4fc2-8fa4-a4c86022bea9-kube-api-access-v5fvc" (OuterVolumeSpecName: "kube-api-access-v5fvc") pod "8bf6c021-b677-4fc2-8fa4-a4c86022bea9" (UID: "8bf6c021-b677-4fc2-8fa4-a4c86022bea9"). InnerVolumeSpecName "kube-api-access-v5fvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:02:05 crc kubenswrapper[4748]: I0320 12:02:05.771305 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5fvc\" (UniqueName: \"kubernetes.io/projected/8bf6c021-b677-4fc2-8fa4-a4c86022bea9-kube-api-access-v5fvc\") on node \"crc\" DevicePath \"\"" Mar 20 12:02:06 crc kubenswrapper[4748]: I0320 12:02:06.324555 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566802-hvkxv" event={"ID":"8bf6c021-b677-4fc2-8fa4-a4c86022bea9","Type":"ContainerDied","Data":"da311151a42bc0e69780875eeb075ba50b1b89e50a7eb2f7063be0d0a3807a06"} Mar 20 12:02:06 crc kubenswrapper[4748]: I0320 12:02:06.324594 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566802-hvkxv" Mar 20 12:02:06 crc kubenswrapper[4748]: I0320 12:02:06.324601 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da311151a42bc0e69780875eeb075ba50b1b89e50a7eb2f7063be0d0a3807a06" Mar 20 12:02:06 crc kubenswrapper[4748]: I0320 12:02:06.732154 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566796-7chdb"] Mar 20 12:02:06 crc kubenswrapper[4748]: I0320 12:02:06.741600 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566796-7chdb"] Mar 20 12:02:07 crc kubenswrapper[4748]: I0320 12:02:07.530034 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3" path="/var/lib/kubelet/pods/cd818bd0-acc6-4b57-bd4e-5e1ad6648bb3/volumes" Mar 20 12:02:12 crc kubenswrapper[4748]: I0320 12:02:12.928894 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 12:02:12 crc kubenswrapper[4748]: I0320 12:02:12.929592 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 12:02:12 crc kubenswrapper[4748]: I0320 12:02:12.929646 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 12:02:12 crc kubenswrapper[4748]: I0320 12:02:12.930651 4748 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ce726fbb2e9c7a69b53896504587de01360f862a5598de467da49c2048886e6"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 12:02:12 crc kubenswrapper[4748]: I0320 12:02:12.930721 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" containerID="cri-o://5ce726fbb2e9c7a69b53896504587de01360f862a5598de467da49c2048886e6" gracePeriod=600 Mar 20 12:02:13 crc kubenswrapper[4748]: I0320 12:02:13.384463 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="5ce726fbb2e9c7a69b53896504587de01360f862a5598de467da49c2048886e6" exitCode=0 Mar 20 12:02:13 crc kubenswrapper[4748]: I0320 12:02:13.384767 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"5ce726fbb2e9c7a69b53896504587de01360f862a5598de467da49c2048886e6"} Mar 20 12:02:13 crc kubenswrapper[4748]: I0320 12:02:13.384905 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerStarted","Data":"ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"} Mar 20 12:02:13 crc kubenswrapper[4748]: I0320 12:02:13.386044 4748 scope.go:117] "RemoveContainer" containerID="219c994104121ab22714c2868fe3144cfbb2c9dfcdad0eea40b956e614be504d" Mar 20 12:02:25 crc kubenswrapper[4748]: I0320 12:02:25.319670 4748 scope.go:117] "RemoveContainer" containerID="62ddfd967246f7d6057932ccb065fed02ec0476bf4b6493fac2d4d97f9c433c7" Mar 20 
12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.158875 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566804-jkm9d"] Mar 20 12:04:00 crc kubenswrapper[4748]: E0320 12:04:00.160088 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf6c021-b677-4fc2-8fa4-a4c86022bea9" containerName="oc" Mar 20 12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.160112 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf6c021-b677-4fc2-8fa4-a4c86022bea9" containerName="oc" Mar 20 12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.160434 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf6c021-b677-4fc2-8fa4-a4c86022bea9" containerName="oc" Mar 20 12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.161412 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566804-jkm9d" Mar 20 12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.163811 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.164350 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.164625 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.172240 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566804-jkm9d"] Mar 20 12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.287394 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjgx4\" (UniqueName: \"kubernetes.io/projected/66f57066-8e96-4ab7-8253-160b10c02883-kube-api-access-jjgx4\") pod \"auto-csr-approver-29566804-jkm9d\" (UID: 
\"66f57066-8e96-4ab7-8253-160b10c02883\") " pod="openshift-infra/auto-csr-approver-29566804-jkm9d" Mar 20 12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.390110 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjgx4\" (UniqueName: \"kubernetes.io/projected/66f57066-8e96-4ab7-8253-160b10c02883-kube-api-access-jjgx4\") pod \"auto-csr-approver-29566804-jkm9d\" (UID: \"66f57066-8e96-4ab7-8253-160b10c02883\") " pod="openshift-infra/auto-csr-approver-29566804-jkm9d" Mar 20 12:04:00 crc kubenswrapper[4748]: I0320 12:04:00.971399 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjgx4\" (UniqueName: \"kubernetes.io/projected/66f57066-8e96-4ab7-8253-160b10c02883-kube-api-access-jjgx4\") pod \"auto-csr-approver-29566804-jkm9d\" (UID: \"66f57066-8e96-4ab7-8253-160b10c02883\") " pod="openshift-infra/auto-csr-approver-29566804-jkm9d" Mar 20 12:04:01 crc kubenswrapper[4748]: I0320 12:04:01.082892 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566804-jkm9d" Mar 20 12:04:01 crc kubenswrapper[4748]: I0320 12:04:01.540648 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566804-jkm9d"] Mar 20 12:04:01 crc kubenswrapper[4748]: W0320 12:04:01.555441 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66f57066_8e96_4ab7_8253_160b10c02883.slice/crio-0c66b317833878467ad81192835b9ef36bd792531e48c1a15cb34df29a1f0b57 WatchSource:0}: Error finding container 0c66b317833878467ad81192835b9ef36bd792531e48c1a15cb34df29a1f0b57: Status 404 returned error can't find the container with id 0c66b317833878467ad81192835b9ef36bd792531e48c1a15cb34df29a1f0b57 Mar 20 12:04:01 crc kubenswrapper[4748]: I0320 12:04:01.560940 4748 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 12:04:02 crc kubenswrapper[4748]: I0320 12:04:02.432644 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566804-jkm9d" event={"ID":"66f57066-8e96-4ab7-8253-160b10c02883","Type":"ContainerStarted","Data":"0c66b317833878467ad81192835b9ef36bd792531e48c1a15cb34df29a1f0b57"} Mar 20 12:04:04 crc kubenswrapper[4748]: I0320 12:04:04.452713 4748 generic.go:334] "Generic (PLEG): container finished" podID="66f57066-8e96-4ab7-8253-160b10c02883" containerID="55443e94f0e0e15d831898f9631e022160de0e05a9cc7ef063b9519913262645" exitCode=0 Mar 20 12:04:04 crc kubenswrapper[4748]: I0320 12:04:04.453318 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566804-jkm9d" event={"ID":"66f57066-8e96-4ab7-8253-160b10c02883","Type":"ContainerDied","Data":"55443e94f0e0e15d831898f9631e022160de0e05a9cc7ef063b9519913262645"} Mar 20 12:04:05 crc kubenswrapper[4748]: I0320 12:04:05.818767 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566804-jkm9d" Mar 20 12:04:05 crc kubenswrapper[4748]: I0320 12:04:05.913867 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjgx4\" (UniqueName: \"kubernetes.io/projected/66f57066-8e96-4ab7-8253-160b10c02883-kube-api-access-jjgx4\") pod \"66f57066-8e96-4ab7-8253-160b10c02883\" (UID: \"66f57066-8e96-4ab7-8253-160b10c02883\") " Mar 20 12:04:05 crc kubenswrapper[4748]: I0320 12:04:05.920307 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f57066-8e96-4ab7-8253-160b10c02883-kube-api-access-jjgx4" (OuterVolumeSpecName: "kube-api-access-jjgx4") pod "66f57066-8e96-4ab7-8253-160b10c02883" (UID: "66f57066-8e96-4ab7-8253-160b10c02883"). InnerVolumeSpecName "kube-api-access-jjgx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:04:06 crc kubenswrapper[4748]: I0320 12:04:06.016356 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjgx4\" (UniqueName: \"kubernetes.io/projected/66f57066-8e96-4ab7-8253-160b10c02883-kube-api-access-jjgx4\") on node \"crc\" DevicePath \"\"" Mar 20 12:04:06 crc kubenswrapper[4748]: I0320 12:04:06.474468 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566804-jkm9d" event={"ID":"66f57066-8e96-4ab7-8253-160b10c02883","Type":"ContainerDied","Data":"0c66b317833878467ad81192835b9ef36bd792531e48c1a15cb34df29a1f0b57"} Mar 20 12:04:06 crc kubenswrapper[4748]: I0320 12:04:06.474515 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c66b317833878467ad81192835b9ef36bd792531e48c1a15cb34df29a1f0b57" Mar 20 12:04:06 crc kubenswrapper[4748]: I0320 12:04:06.474867 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566804-jkm9d" Mar 20 12:04:06 crc kubenswrapper[4748]: I0320 12:04:06.900060 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566798-klbz6"] Mar 20 12:04:06 crc kubenswrapper[4748]: I0320 12:04:06.907491 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566798-klbz6"] Mar 20 12:04:07 crc kubenswrapper[4748]: I0320 12:04:07.525305 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3d414f8-6f11-4e8a-8add-cb74488d3568" path="/var/lib/kubelet/pods/d3d414f8-6f11-4e8a-8add-cb74488d3568/volumes" Mar 20 12:04:10 crc kubenswrapper[4748]: I0320 12:04:10.957961 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-82sdm"] Mar 20 12:04:10 crc kubenswrapper[4748]: E0320 12:04:10.958692 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f57066-8e96-4ab7-8253-160b10c02883" containerName="oc" Mar 20 12:04:10 crc kubenswrapper[4748]: I0320 12:04:10.958705 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f57066-8e96-4ab7-8253-160b10c02883" containerName="oc" Mar 20 12:04:10 crc kubenswrapper[4748]: I0320 12:04:10.958939 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="66f57066-8e96-4ab7-8253-160b10c02883" containerName="oc" Mar 20 12:04:10 crc kubenswrapper[4748]: I0320 12:04:10.968636 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:10 crc kubenswrapper[4748]: I0320 12:04:10.970505 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82sdm"] Mar 20 12:04:11 crc kubenswrapper[4748]: I0320 12:04:11.129418 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-utilities\") pod \"certified-operators-82sdm\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:11 crc kubenswrapper[4748]: I0320 12:04:11.129869 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrvqv\" (UniqueName: \"kubernetes.io/projected/d508e50c-63c7-44dd-b39d-af458fdafd1d-kube-api-access-mrvqv\") pod \"certified-operators-82sdm\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:11 crc kubenswrapper[4748]: I0320 12:04:11.130038 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-catalog-content\") pod \"certified-operators-82sdm\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:11 crc kubenswrapper[4748]: I0320 12:04:11.232307 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-catalog-content\") pod \"certified-operators-82sdm\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:11 crc kubenswrapper[4748]: I0320 12:04:11.232451 4748 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-utilities\") pod \"certified-operators-82sdm\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:11 crc kubenswrapper[4748]: I0320 12:04:11.232521 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrvqv\" (UniqueName: \"kubernetes.io/projected/d508e50c-63c7-44dd-b39d-af458fdafd1d-kube-api-access-mrvqv\") pod \"certified-operators-82sdm\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:11 crc kubenswrapper[4748]: I0320 12:04:11.232851 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-catalog-content\") pod \"certified-operators-82sdm\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:11 crc kubenswrapper[4748]: I0320 12:04:11.232994 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-utilities\") pod \"certified-operators-82sdm\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:11 crc kubenswrapper[4748]: I0320 12:04:11.671605 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrvqv\" (UniqueName: \"kubernetes.io/projected/d508e50c-63c7-44dd-b39d-af458fdafd1d-kube-api-access-mrvqv\") pod \"certified-operators-82sdm\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:11 crc kubenswrapper[4748]: I0320 12:04:11.919911 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:12 crc kubenswrapper[4748]: I0320 12:04:12.379542 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82sdm"] Mar 20 12:04:12 crc kubenswrapper[4748]: I0320 12:04:12.525623 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82sdm" event={"ID":"d508e50c-63c7-44dd-b39d-af458fdafd1d","Type":"ContainerStarted","Data":"48fc5b631520df59f477cb2fcf57d4e1dead802572868e65e2478a05f83157fd"} Mar 20 12:04:13 crc kubenswrapper[4748]: I0320 12:04:13.556528 4748 generic.go:334] "Generic (PLEG): container finished" podID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerID="77ae68ec6c448843e1b81d0a5f683fe3a669fc7245396db34fe86994f52612b6" exitCode=0 Mar 20 12:04:13 crc kubenswrapper[4748]: I0320 12:04:13.557003 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82sdm" event={"ID":"d508e50c-63c7-44dd-b39d-af458fdafd1d","Type":"ContainerDied","Data":"77ae68ec6c448843e1b81d0a5f683fe3a669fc7245396db34fe86994f52612b6"} Mar 20 12:04:15 crc kubenswrapper[4748]: I0320 12:04:15.584903 4748 generic.go:334] "Generic (PLEG): container finished" podID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerID="55f06249ca061a2c8322436f8178f4e0f69273acbe30a5cfb604417f796d1d21" exitCode=0 Mar 20 12:04:15 crc kubenswrapper[4748]: I0320 12:04:15.585439 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82sdm" event={"ID":"d508e50c-63c7-44dd-b39d-af458fdafd1d","Type":"ContainerDied","Data":"55f06249ca061a2c8322436f8178f4e0f69273acbe30a5cfb604417f796d1d21"} Mar 20 12:04:16 crc kubenswrapper[4748]: I0320 12:04:16.596559 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82sdm" 
event={"ID":"d508e50c-63c7-44dd-b39d-af458fdafd1d","Type":"ContainerStarted","Data":"95e142b154181e4e400dfe67704d6bb8fa93ada11c9d187f94f092abb6b05be0"} Mar 20 12:04:16 crc kubenswrapper[4748]: I0320 12:04:16.629524 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-82sdm" podStartSLOduration=4.099363153 podStartE2EDuration="6.629499436s" podCreationTimestamp="2026-03-20 12:04:10 +0000 UTC" firstStartedPulling="2026-03-20 12:04:13.566194004 +0000 UTC m=+5288.707739828" lastFinishedPulling="2026-03-20 12:04:16.096330297 +0000 UTC m=+5291.237876111" observedRunningTime="2026-03-20 12:04:16.619519373 +0000 UTC m=+5291.761065197" watchObservedRunningTime="2026-03-20 12:04:16.629499436 +0000 UTC m=+5291.771045260" Mar 20 12:04:21 crc kubenswrapper[4748]: I0320 12:04:21.920552 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:21 crc kubenswrapper[4748]: I0320 12:04:21.921140 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:22 crc kubenswrapper[4748]: I0320 12:04:22.618378 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:22 crc kubenswrapper[4748]: I0320 12:04:22.751039 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:22 crc kubenswrapper[4748]: I0320 12:04:22.852589 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-82sdm"] Mar 20 12:04:24 crc kubenswrapper[4748]: I0320 12:04:24.687173 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-82sdm" podUID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerName="registry-server" 
containerID="cri-o://95e142b154181e4e400dfe67704d6bb8fa93ada11c9d187f94f092abb6b05be0" gracePeriod=2 Mar 20 12:04:25 crc kubenswrapper[4748]: I0320 12:04:25.424596 4748 scope.go:117] "RemoveContainer" containerID="4cd940b45880cfb45cda841808feb1b79d2aaa1acc6ece545a5aadba48087178" Mar 20 12:04:25 crc kubenswrapper[4748]: I0320 12:04:25.457361 4748 scope.go:117] "RemoveContainer" containerID="2e58deefd358ab82d99a1de69a45a8a81612661f4628d642d9adb6be22fc770e" Mar 20 12:04:25 crc kubenswrapper[4748]: I0320 12:04:25.698860 4748 generic.go:334] "Generic (PLEG): container finished" podID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerID="95e142b154181e4e400dfe67704d6bb8fa93ada11c9d187f94f092abb6b05be0" exitCode=0 Mar 20 12:04:25 crc kubenswrapper[4748]: I0320 12:04:25.698964 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82sdm" event={"ID":"d508e50c-63c7-44dd-b39d-af458fdafd1d","Type":"ContainerDied","Data":"95e142b154181e4e400dfe67704d6bb8fa93ada11c9d187f94f092abb6b05be0"} Mar 20 12:04:25 crc kubenswrapper[4748]: I0320 12:04:25.701901 4748 generic.go:334] "Generic (PLEG): container finished" podID="90804811-2638-4a6e-8f6c-259fe4e36763" containerID="ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104" exitCode=0 Mar 20 12:04:25 crc kubenswrapper[4748]: I0320 12:04:25.701959 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpnp5/must-gather-plw76" event={"ID":"90804811-2638-4a6e-8f6c-259fe4e36763","Type":"ContainerDied","Data":"ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104"} Mar 20 12:04:25 crc kubenswrapper[4748]: I0320 12:04:25.702796 4748 scope.go:117] "RemoveContainer" containerID="ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.207018 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.215800 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-utilities\") pod \"d508e50c-63c7-44dd-b39d-af458fdafd1d\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.215892 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-catalog-content\") pod \"d508e50c-63c7-44dd-b39d-af458fdafd1d\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.216057 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrvqv\" (UniqueName: \"kubernetes.io/projected/d508e50c-63c7-44dd-b39d-af458fdafd1d-kube-api-access-mrvqv\") pod \"d508e50c-63c7-44dd-b39d-af458fdafd1d\" (UID: \"d508e50c-63c7-44dd-b39d-af458fdafd1d\") " Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.217055 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-utilities" (OuterVolumeSpecName: "utilities") pod "d508e50c-63c7-44dd-b39d-af458fdafd1d" (UID: "d508e50c-63c7-44dd-b39d-af458fdafd1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.221447 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d508e50c-63c7-44dd-b39d-af458fdafd1d-kube-api-access-mrvqv" (OuterVolumeSpecName: "kube-api-access-mrvqv") pod "d508e50c-63c7-44dd-b39d-af458fdafd1d" (UID: "d508e50c-63c7-44dd-b39d-af458fdafd1d"). InnerVolumeSpecName "kube-api-access-mrvqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.266970 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d508e50c-63c7-44dd-b39d-af458fdafd1d" (UID: "d508e50c-63c7-44dd-b39d-af458fdafd1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.318734 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.318767 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d508e50c-63c7-44dd-b39d-af458fdafd1d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.318783 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrvqv\" (UniqueName: \"kubernetes.io/projected/d508e50c-63c7-44dd-b39d-af458fdafd1d-kube-api-access-mrvqv\") on node \"crc\" DevicePath \"\"" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.651721 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vpnp5_must-gather-plw76_90804811-2638-4a6e-8f6c-259fe4e36763/gather/0.log" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.711964 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82sdm" event={"ID":"d508e50c-63c7-44dd-b39d-af458fdafd1d","Type":"ContainerDied","Data":"48fc5b631520df59f477cb2fcf57d4e1dead802572868e65e2478a05f83157fd"} Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.712027 4748 scope.go:117] "RemoveContainer" 
containerID="95e142b154181e4e400dfe67704d6bb8fa93ada11c9d187f94f092abb6b05be0" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.712080 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-82sdm" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.730002 4748 scope.go:117] "RemoveContainer" containerID="55f06249ca061a2c8322436f8178f4e0f69273acbe30a5cfb604417f796d1d21" Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.748824 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-82sdm"] Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.757138 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-82sdm"] Mar 20 12:04:26 crc kubenswrapper[4748]: I0320 12:04:26.770988 4748 scope.go:117] "RemoveContainer" containerID="77ae68ec6c448843e1b81d0a5f683fe3a669fc7245396db34fe86994f52612b6" Mar 20 12:04:27 crc kubenswrapper[4748]: I0320 12:04:27.535077 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d508e50c-63c7-44dd-b39d-af458fdafd1d" path="/var/lib/kubelet/pods/d508e50c-63c7-44dd-b39d-af458fdafd1d/volumes" Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.297122 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vpnp5/must-gather-plw76"] Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.297915 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vpnp5/must-gather-plw76" podUID="90804811-2638-4a6e-8f6c-259fe4e36763" containerName="copy" containerID="cri-o://a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb" gracePeriod=2 Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.308820 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vpnp5/must-gather-plw76"] Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.787796 4748 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vpnp5_must-gather-plw76_90804811-2638-4a6e-8f6c-259fe4e36763/copy/0.log" Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.788609 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpnp5/must-gather-plw76" Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.866259 4748 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vpnp5_must-gather-plw76_90804811-2638-4a6e-8f6c-259fe4e36763/copy/0.log" Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.866773 4748 generic.go:334] "Generic (PLEG): container finished" podID="90804811-2638-4a6e-8f6c-259fe4e36763" containerID="a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb" exitCode=143 Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.866826 4748 scope.go:117] "RemoveContainer" containerID="a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb" Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.866850 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpnp5/must-gather-plw76" Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.885076 4748 scope.go:117] "RemoveContainer" containerID="ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104" Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.919915 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q57hp\" (UniqueName: \"kubernetes.io/projected/90804811-2638-4a6e-8f6c-259fe4e36763-kube-api-access-q57hp\") pod \"90804811-2638-4a6e-8f6c-259fe4e36763\" (UID: \"90804811-2638-4a6e-8f6c-259fe4e36763\") " Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.920096 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90804811-2638-4a6e-8f6c-259fe4e36763-must-gather-output\") pod \"90804811-2638-4a6e-8f6c-259fe4e36763\" (UID: \"90804811-2638-4a6e-8f6c-259fe4e36763\") " Mar 20 12:04:40 crc kubenswrapper[4748]: I0320 12:04:40.971074 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90804811-2638-4a6e-8f6c-259fe4e36763-kube-api-access-q57hp" (OuterVolumeSpecName: "kube-api-access-q57hp") pod "90804811-2638-4a6e-8f6c-259fe4e36763" (UID: "90804811-2638-4a6e-8f6c-259fe4e36763"). InnerVolumeSpecName "kube-api-access-q57hp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:04:41 crc kubenswrapper[4748]: I0320 12:04:41.015630 4748 scope.go:117] "RemoveContainer" containerID="a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb" Mar 20 12:04:41 crc kubenswrapper[4748]: E0320 12:04:41.016050 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb\": container with ID starting with a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb not found: ID does not exist" containerID="a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb" Mar 20 12:04:41 crc kubenswrapper[4748]: I0320 12:04:41.016086 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb"} err="failed to get container status \"a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb\": rpc error: code = NotFound desc = could not find container \"a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb\": container with ID starting with a4a01dbb18ef461ba0c64479a20d20ef746373bff8c15104e3883a9f6d2970bb not found: ID does not exist" Mar 20 12:04:41 crc kubenswrapper[4748]: I0320 12:04:41.016106 4748 scope.go:117] "RemoveContainer" containerID="ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104" Mar 20 12:04:41 crc kubenswrapper[4748]: E0320 12:04:41.016668 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104\": container with ID starting with ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104 not found: ID does not exist" containerID="ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104" Mar 20 12:04:41 crc kubenswrapper[4748]: I0320 12:04:41.016719 
4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104"} err="failed to get container status \"ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104\": rpc error: code = NotFound desc = could not find container \"ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104\": container with ID starting with ff913d3ba97b49e87ed17c2430a769c1c6407f922d7f27c811106e2d70e79104 not found: ID does not exist" Mar 20 12:04:41 crc kubenswrapper[4748]: I0320 12:04:41.022240 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q57hp\" (UniqueName: \"kubernetes.io/projected/90804811-2638-4a6e-8f6c-259fe4e36763-kube-api-access-q57hp\") on node \"crc\" DevicePath \"\"" Mar 20 12:04:41 crc kubenswrapper[4748]: I0320 12:04:41.106618 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90804811-2638-4a6e-8f6c-259fe4e36763-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "90804811-2638-4a6e-8f6c-259fe4e36763" (UID: "90804811-2638-4a6e-8f6c-259fe4e36763"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:04:41 crc kubenswrapper[4748]: I0320 12:04:41.124518 4748 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90804811-2638-4a6e-8f6c-259fe4e36763-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 12:04:41 crc kubenswrapper[4748]: I0320 12:04:41.525693 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90804811-2638-4a6e-8f6c-259fe4e36763" path="/var/lib/kubelet/pods/90804811-2638-4a6e-8f6c-259fe4e36763/volumes" Mar 20 12:04:42 crc kubenswrapper[4748]: I0320 12:04:42.928386 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 12:04:42 crc kubenswrapper[4748]: I0320 12:04:42.929691 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 12:05:12 crc kubenswrapper[4748]: I0320 12:05:12.928401 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 12:05:12 crc kubenswrapper[4748]: I0320 12:05:12.928880 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 12:05:25 crc kubenswrapper[4748]: I0320 12:05:25.568364 4748 scope.go:117] "RemoveContainer" containerID="1729455ba7adc1f65024369f256253ffaeec71d3249a5476cf01c3e56c8959a1" Mar 20 12:05:42 crc kubenswrapper[4748]: I0320 12:05:42.928543 4748 patch_prober.go:28] interesting pod/machine-config-daemon-5lbvz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 12:05:42 crc kubenswrapper[4748]: I0320 12:05:42.929177 4748 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 12:05:42 crc kubenswrapper[4748]: I0320 12:05:42.929249 4748 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" Mar 20 12:05:42 crc kubenswrapper[4748]: I0320 12:05:42.930238 4748 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"} pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 12:05:42 crc kubenswrapper[4748]: I0320 12:05:42.930338 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerName="machine-config-daemon" 
containerID="cri-o://ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" gracePeriod=600 Mar 20 12:05:43 crc kubenswrapper[4748]: E0320 12:05:43.061366 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:05:43 crc kubenswrapper[4748]: I0320 12:05:43.482399 4748 generic.go:334] "Generic (PLEG): container finished" podID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" exitCode=0 Mar 20 12:05:43 crc kubenswrapper[4748]: I0320 12:05:43.482464 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" event={"ID":"8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c","Type":"ContainerDied","Data":"ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"} Mar 20 12:05:43 crc kubenswrapper[4748]: I0320 12:05:43.482547 4748 scope.go:117] "RemoveContainer" containerID="5ce726fbb2e9c7a69b53896504587de01360f862a5598de467da49c2048886e6" Mar 20 12:05:43 crc kubenswrapper[4748]: I0320 12:05:43.483486 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" Mar 20 12:05:43 crc kubenswrapper[4748]: E0320 12:05:43.483946 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" 
podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:05:57 crc kubenswrapper[4748]: I0320 12:05:57.515627 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" Mar 20 12:05:57 crc kubenswrapper[4748]: E0320 12:05:57.516419 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.146609 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566806-js5zp"] Mar 20 12:06:00 crc kubenswrapper[4748]: E0320 12:06:00.147339 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90804811-2638-4a6e-8f6c-259fe4e36763" containerName="gather" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.147355 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="90804811-2638-4a6e-8f6c-259fe4e36763" containerName="gather" Mar 20 12:06:00 crc kubenswrapper[4748]: E0320 12:06:00.147388 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerName="registry-server" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.147398 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerName="registry-server" Mar 20 12:06:00 crc kubenswrapper[4748]: E0320 12:06:00.147411 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerName="extract-utilities" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.147419 4748 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerName="extract-utilities" Mar 20 12:06:00 crc kubenswrapper[4748]: E0320 12:06:00.147432 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90804811-2638-4a6e-8f6c-259fe4e36763" containerName="copy" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.147439 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="90804811-2638-4a6e-8f6c-259fe4e36763" containerName="copy" Mar 20 12:06:00 crc kubenswrapper[4748]: E0320 12:06:00.147466 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerName="extract-content" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.147473 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerName="extract-content" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.147665 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="90804811-2638-4a6e-8f6c-259fe4e36763" containerName="gather" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.147694 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="d508e50c-63c7-44dd-b39d-af458fdafd1d" containerName="registry-server" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.147715 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="90804811-2638-4a6e-8f6c-259fe4e36763" containerName="copy" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.148485 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566806-js5zp" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.156295 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.156431 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566806-js5zp"] Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.156697 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.156765 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.270861 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh4p4\" (UniqueName: \"kubernetes.io/projected/c65db8f7-20a2-442b-9fc0-b60c195295d9-kube-api-access-wh4p4\") pod \"auto-csr-approver-29566806-js5zp\" (UID: \"c65db8f7-20a2-442b-9fc0-b60c195295d9\") " pod="openshift-infra/auto-csr-approver-29566806-js5zp" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.373517 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh4p4\" (UniqueName: \"kubernetes.io/projected/c65db8f7-20a2-442b-9fc0-b60c195295d9-kube-api-access-wh4p4\") pod \"auto-csr-approver-29566806-js5zp\" (UID: \"c65db8f7-20a2-442b-9fc0-b60c195295d9\") " pod="openshift-infra/auto-csr-approver-29566806-js5zp" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.397357 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh4p4\" (UniqueName: \"kubernetes.io/projected/c65db8f7-20a2-442b-9fc0-b60c195295d9-kube-api-access-wh4p4\") pod \"auto-csr-approver-29566806-js5zp\" (UID: \"c65db8f7-20a2-442b-9fc0-b60c195295d9\") " 
pod="openshift-infra/auto-csr-approver-29566806-js5zp" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.484714 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566806-js5zp" Mar 20 12:06:00 crc kubenswrapper[4748]: I0320 12:06:00.968138 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566806-js5zp"] Mar 20 12:06:01 crc kubenswrapper[4748]: I0320 12:06:01.684036 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566806-js5zp" event={"ID":"c65db8f7-20a2-442b-9fc0-b60c195295d9","Type":"ContainerStarted","Data":"44b95c55489d322475403b999e2ec8e2c284ba9a644a8cd7235708165f33bb4a"} Mar 20 12:06:02 crc kubenswrapper[4748]: I0320 12:06:02.694408 4748 generic.go:334] "Generic (PLEG): container finished" podID="c65db8f7-20a2-442b-9fc0-b60c195295d9" containerID="26d10eb4853c74668fbec83496c6bda10b3d4d379550dd213c251b90b4226beb" exitCode=0 Mar 20 12:06:02 crc kubenswrapper[4748]: I0320 12:06:02.694459 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566806-js5zp" event={"ID":"c65db8f7-20a2-442b-9fc0-b60c195295d9","Type":"ContainerDied","Data":"26d10eb4853c74668fbec83496c6bda10b3d4d379550dd213c251b90b4226beb"} Mar 20 12:06:04 crc kubenswrapper[4748]: I0320 12:06:04.074871 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566806-js5zp" Mar 20 12:06:04 crc kubenswrapper[4748]: I0320 12:06:04.162658 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh4p4\" (UniqueName: \"kubernetes.io/projected/c65db8f7-20a2-442b-9fc0-b60c195295d9-kube-api-access-wh4p4\") pod \"c65db8f7-20a2-442b-9fc0-b60c195295d9\" (UID: \"c65db8f7-20a2-442b-9fc0-b60c195295d9\") " Mar 20 12:06:04 crc kubenswrapper[4748]: I0320 12:06:04.169215 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65db8f7-20a2-442b-9fc0-b60c195295d9-kube-api-access-wh4p4" (OuterVolumeSpecName: "kube-api-access-wh4p4") pod "c65db8f7-20a2-442b-9fc0-b60c195295d9" (UID: "c65db8f7-20a2-442b-9fc0-b60c195295d9"). InnerVolumeSpecName "kube-api-access-wh4p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:06:04 crc kubenswrapper[4748]: I0320 12:06:04.266296 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh4p4\" (UniqueName: \"kubernetes.io/projected/c65db8f7-20a2-442b-9fc0-b60c195295d9-kube-api-access-wh4p4\") on node \"crc\" DevicePath \"\"" Mar 20 12:06:04 crc kubenswrapper[4748]: I0320 12:06:04.714406 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566806-js5zp" event={"ID":"c65db8f7-20a2-442b-9fc0-b60c195295d9","Type":"ContainerDied","Data":"44b95c55489d322475403b999e2ec8e2c284ba9a644a8cd7235708165f33bb4a"} Mar 20 12:06:04 crc kubenswrapper[4748]: I0320 12:06:04.714446 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44b95c55489d322475403b999e2ec8e2c284ba9a644a8cd7235708165f33bb4a" Mar 20 12:06:04 crc kubenswrapper[4748]: I0320 12:06:04.714520 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566806-js5zp" Mar 20 12:06:05 crc kubenswrapper[4748]: I0320 12:06:05.153803 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566800-pzr77"] Mar 20 12:06:05 crc kubenswrapper[4748]: I0320 12:06:05.166969 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566800-pzr77"] Mar 20 12:06:05 crc kubenswrapper[4748]: I0320 12:06:05.534358 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d5b5af3-b53b-44d3-a690-cbc64f7c61fa" path="/var/lib/kubelet/pods/9d5b5af3-b53b-44d3-a690-cbc64f7c61fa/volumes" Mar 20 12:06:11 crc kubenswrapper[4748]: I0320 12:06:11.515691 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" Mar 20 12:06:11 crc kubenswrapper[4748]: E0320 12:06:11.516605 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:06:13 crc kubenswrapper[4748]: I0320 12:06:13.778115 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rrthr"] Mar 20 12:06:13 crc kubenswrapper[4748]: E0320 12:06:13.779075 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65db8f7-20a2-442b-9fc0-b60c195295d9" containerName="oc" Mar 20 12:06:13 crc kubenswrapper[4748]: I0320 12:06:13.779096 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65db8f7-20a2-442b-9fc0-b60c195295d9" containerName="oc" Mar 20 12:06:13 crc kubenswrapper[4748]: I0320 12:06:13.779440 4748 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c65db8f7-20a2-442b-9fc0-b60c195295d9" containerName="oc" Mar 20 12:06:13 crc kubenswrapper[4748]: I0320 12:06:13.781652 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:13 crc kubenswrapper[4748]: I0320 12:06:13.787493 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrthr"] Mar 20 12:06:13 crc kubenswrapper[4748]: I0320 12:06:13.965347 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-catalog-content\") pod \"redhat-marketplace-rrthr\" (UID: \"8509f672-fbd9-42ba-985a-715bdc4178af\") " pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:13 crc kubenswrapper[4748]: I0320 12:06:13.965420 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfbqs\" (UniqueName: \"kubernetes.io/projected/8509f672-fbd9-42ba-985a-715bdc4178af-kube-api-access-jfbqs\") pod \"redhat-marketplace-rrthr\" (UID: \"8509f672-fbd9-42ba-985a-715bdc4178af\") " pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:13 crc kubenswrapper[4748]: I0320 12:06:13.965467 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-utilities\") pod \"redhat-marketplace-rrthr\" (UID: \"8509f672-fbd9-42ba-985a-715bdc4178af\") " pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:14 crc kubenswrapper[4748]: I0320 12:06:14.067093 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-utilities\") pod \"redhat-marketplace-rrthr\" (UID: 
\"8509f672-fbd9-42ba-985a-715bdc4178af\") " pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:14 crc kubenswrapper[4748]: I0320 12:06:14.067232 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-catalog-content\") pod \"redhat-marketplace-rrthr\" (UID: \"8509f672-fbd9-42ba-985a-715bdc4178af\") " pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:14 crc kubenswrapper[4748]: I0320 12:06:14.067277 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfbqs\" (UniqueName: \"kubernetes.io/projected/8509f672-fbd9-42ba-985a-715bdc4178af-kube-api-access-jfbqs\") pod \"redhat-marketplace-rrthr\" (UID: \"8509f672-fbd9-42ba-985a-715bdc4178af\") " pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:14 crc kubenswrapper[4748]: I0320 12:06:14.067661 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-utilities\") pod \"redhat-marketplace-rrthr\" (UID: \"8509f672-fbd9-42ba-985a-715bdc4178af\") " pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:14 crc kubenswrapper[4748]: I0320 12:06:14.067944 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-catalog-content\") pod \"redhat-marketplace-rrthr\" (UID: \"8509f672-fbd9-42ba-985a-715bdc4178af\") " pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:14 crc kubenswrapper[4748]: I0320 12:06:14.103618 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfbqs\" (UniqueName: \"kubernetes.io/projected/8509f672-fbd9-42ba-985a-715bdc4178af-kube-api-access-jfbqs\") pod \"redhat-marketplace-rrthr\" (UID: 
\"8509f672-fbd9-42ba-985a-715bdc4178af\") " pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:14 crc kubenswrapper[4748]: I0320 12:06:14.117533 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:14 crc kubenswrapper[4748]: I0320 12:06:14.635161 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrthr"] Mar 20 12:06:14 crc kubenswrapper[4748]: I0320 12:06:14.834920 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrthr" event={"ID":"8509f672-fbd9-42ba-985a-715bdc4178af","Type":"ContainerStarted","Data":"b7e9fe0aeb7d0331c3c27c1d5caa11e8df7fb66c1b88e237c659e0568b7f171a"} Mar 20 12:06:14 crc kubenswrapper[4748]: I0320 12:06:14.835000 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrthr" event={"ID":"8509f672-fbd9-42ba-985a-715bdc4178af","Type":"ContainerStarted","Data":"a97722be63124e4460becf9b6e26fc8fec8f538e6037c8fd5d7314e586103219"} Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.149136 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-trpfs"] Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.151505 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.170121 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-trpfs"] Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.291221 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-utilities\") pod \"redhat-operators-trpfs\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.291288 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpp97\" (UniqueName: \"kubernetes.io/projected/fcc3f8d7-c68f-417c-925f-6b921abfd582-kube-api-access-fpp97\") pod \"redhat-operators-trpfs\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.291571 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-catalog-content\") pod \"redhat-operators-trpfs\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.393271 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-utilities\") pod \"redhat-operators-trpfs\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.393331 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fpp97\" (UniqueName: \"kubernetes.io/projected/fcc3f8d7-c68f-417c-925f-6b921abfd582-kube-api-access-fpp97\") pod \"redhat-operators-trpfs\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.393387 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-catalog-content\") pod \"redhat-operators-trpfs\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.393941 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-catalog-content\") pod \"redhat-operators-trpfs\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.394013 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-utilities\") pod \"redhat-operators-trpfs\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.419353 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpp97\" (UniqueName: \"kubernetes.io/projected/fcc3f8d7-c68f-417c-925f-6b921abfd582-kube-api-access-fpp97\") pod \"redhat-operators-trpfs\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.485053 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.843160 4748 generic.go:334] "Generic (PLEG): container finished" podID="8509f672-fbd9-42ba-985a-715bdc4178af" containerID="b7e9fe0aeb7d0331c3c27c1d5caa11e8df7fb66c1b88e237c659e0568b7f171a" exitCode=0 Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.843208 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrthr" event={"ID":"8509f672-fbd9-42ba-985a-715bdc4178af","Type":"ContainerDied","Data":"b7e9fe0aeb7d0331c3c27c1d5caa11e8df7fb66c1b88e237c659e0568b7f171a"} Mar 20 12:06:15 crc kubenswrapper[4748]: I0320 12:06:15.991366 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-trpfs"] Mar 20 12:06:16 crc kubenswrapper[4748]: I0320 12:06:16.852797 4748 generic.go:334] "Generic (PLEG): container finished" podID="8509f672-fbd9-42ba-985a-715bdc4178af" containerID="f6dfe4c6985f26c6283236192c75324f7fdb98755e1f4f9b98b29f4dd830f013" exitCode=0 Mar 20 12:06:16 crc kubenswrapper[4748]: I0320 12:06:16.852902 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrthr" event={"ID":"8509f672-fbd9-42ba-985a-715bdc4178af","Type":"ContainerDied","Data":"f6dfe4c6985f26c6283236192c75324f7fdb98755e1f4f9b98b29f4dd830f013"} Mar 20 12:06:16 crc kubenswrapper[4748]: I0320 12:06:16.854474 4748 generic.go:334] "Generic (PLEG): container finished" podID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerID="41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82" exitCode=0 Mar 20 12:06:16 crc kubenswrapper[4748]: I0320 12:06:16.854513 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trpfs" event={"ID":"fcc3f8d7-c68f-417c-925f-6b921abfd582","Type":"ContainerDied","Data":"41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82"} Mar 20 12:06:16 crc 
kubenswrapper[4748]: I0320 12:06:16.854551 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trpfs" event={"ID":"fcc3f8d7-c68f-417c-925f-6b921abfd582","Type":"ContainerStarted","Data":"51335e256149e53b1605be0bda73611d51415780898fb4a44a930feec09ea211"} Mar 20 12:06:17 crc kubenswrapper[4748]: I0320 12:06:17.863320 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrthr" event={"ID":"8509f672-fbd9-42ba-985a-715bdc4178af","Type":"ContainerStarted","Data":"73f82d9f688d6f964affca9e56bd95aa42399c236888b85104372515921da7ea"} Mar 20 12:06:17 crc kubenswrapper[4748]: I0320 12:06:17.866812 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trpfs" event={"ID":"fcc3f8d7-c68f-417c-925f-6b921abfd582","Type":"ContainerStarted","Data":"83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105"} Mar 20 12:06:17 crc kubenswrapper[4748]: I0320 12:06:17.891272 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rrthr" podStartSLOduration=3.391612922 podStartE2EDuration="4.891225123s" podCreationTimestamp="2026-03-20 12:06:13 +0000 UTC" firstStartedPulling="2026-03-20 12:06:15.844894781 +0000 UTC m=+5410.986440595" lastFinishedPulling="2026-03-20 12:06:17.344506982 +0000 UTC m=+5412.486052796" observedRunningTime="2026-03-20 12:06:17.881701422 +0000 UTC m=+5413.023247236" watchObservedRunningTime="2026-03-20 12:06:17.891225123 +0000 UTC m=+5413.032770947" Mar 20 12:06:18 crc kubenswrapper[4748]: I0320 12:06:18.884578 4748 generic.go:334] "Generic (PLEG): container finished" podID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerID="83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105" exitCode=0 Mar 20 12:06:18 crc kubenswrapper[4748]: I0320 12:06:18.885516 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-trpfs" event={"ID":"fcc3f8d7-c68f-417c-925f-6b921abfd582","Type":"ContainerDied","Data":"83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105"} Mar 20 12:06:19 crc kubenswrapper[4748]: I0320 12:06:19.895215 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trpfs" event={"ID":"fcc3f8d7-c68f-417c-925f-6b921abfd582","Type":"ContainerStarted","Data":"1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20"} Mar 20 12:06:19 crc kubenswrapper[4748]: I0320 12:06:19.913437 4748 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-trpfs" podStartSLOduration=2.479991554 podStartE2EDuration="4.913413296s" podCreationTimestamp="2026-03-20 12:06:15 +0000 UTC" firstStartedPulling="2026-03-20 12:06:16.855737855 +0000 UTC m=+5411.997283669" lastFinishedPulling="2026-03-20 12:06:19.289159587 +0000 UTC m=+5414.430705411" observedRunningTime="2026-03-20 12:06:19.912295158 +0000 UTC m=+5415.053840982" watchObservedRunningTime="2026-03-20 12:06:19.913413296 +0000 UTC m=+5415.054959130" Mar 20 12:06:24 crc kubenswrapper[4748]: I0320 12:06:24.117666 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:24 crc kubenswrapper[4748]: I0320 12:06:24.118310 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:24 crc kubenswrapper[4748]: I0320 12:06:24.174144 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:24 crc kubenswrapper[4748]: I0320 12:06:24.992139 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:25 crc kubenswrapper[4748]: I0320 12:06:25.491370 4748 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:25 crc kubenswrapper[4748]: I0320 12:06:25.491521 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:25 crc kubenswrapper[4748]: I0320 12:06:25.522623 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" Mar 20 12:06:25 crc kubenswrapper[4748]: E0320 12:06:25.522965 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:06:25 crc kubenswrapper[4748]: I0320 12:06:25.672668 4748 scope.go:117] "RemoveContainer" containerID="f0bac34c16833392314be99b4f108588d770550934c80b4b410212d3b0e6458e" Mar 20 12:06:26 crc kubenswrapper[4748]: I0320 12:06:26.540429 4748 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-trpfs" podUID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerName="registry-server" probeResult="failure" output=< Mar 20 12:06:26 crc kubenswrapper[4748]: timeout: failed to connect service ":50051" within 1s Mar 20 12:06:26 crc kubenswrapper[4748]: > Mar 20 12:06:26 crc kubenswrapper[4748]: I0320 12:06:26.547399 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrthr"] Mar 20 12:06:26 crc kubenswrapper[4748]: I0320 12:06:26.959993 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rrthr" podUID="8509f672-fbd9-42ba-985a-715bdc4178af" containerName="registry-server" 
containerID="cri-o://73f82d9f688d6f964affca9e56bd95aa42399c236888b85104372515921da7ea" gracePeriod=2 Mar 20 12:06:27 crc kubenswrapper[4748]: I0320 12:06:27.981228 4748 generic.go:334] "Generic (PLEG): container finished" podID="8509f672-fbd9-42ba-985a-715bdc4178af" containerID="73f82d9f688d6f964affca9e56bd95aa42399c236888b85104372515921da7ea" exitCode=0 Mar 20 12:06:27 crc kubenswrapper[4748]: I0320 12:06:27.981445 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrthr" event={"ID":"8509f672-fbd9-42ba-985a-715bdc4178af","Type":"ContainerDied","Data":"73f82d9f688d6f964affca9e56bd95aa42399c236888b85104372515921da7ea"} Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.180240 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.359478 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-utilities\") pod \"8509f672-fbd9-42ba-985a-715bdc4178af\" (UID: \"8509f672-fbd9-42ba-985a-715bdc4178af\") " Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.359645 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfbqs\" (UniqueName: \"kubernetes.io/projected/8509f672-fbd9-42ba-985a-715bdc4178af-kube-api-access-jfbqs\") pod \"8509f672-fbd9-42ba-985a-715bdc4178af\" (UID: \"8509f672-fbd9-42ba-985a-715bdc4178af\") " Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.359683 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-catalog-content\") pod \"8509f672-fbd9-42ba-985a-715bdc4178af\" (UID: \"8509f672-fbd9-42ba-985a-715bdc4178af\") " Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 
12:06:28.360660 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-utilities" (OuterVolumeSpecName: "utilities") pod "8509f672-fbd9-42ba-985a-715bdc4178af" (UID: "8509f672-fbd9-42ba-985a-715bdc4178af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.369375 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8509f672-fbd9-42ba-985a-715bdc4178af-kube-api-access-jfbqs" (OuterVolumeSpecName: "kube-api-access-jfbqs") pod "8509f672-fbd9-42ba-985a-715bdc4178af" (UID: "8509f672-fbd9-42ba-985a-715bdc4178af"). InnerVolumeSpecName "kube-api-access-jfbqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.389506 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8509f672-fbd9-42ba-985a-715bdc4178af" (UID: "8509f672-fbd9-42ba-985a-715bdc4178af"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.462123 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.462413 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfbqs\" (UniqueName: \"kubernetes.io/projected/8509f672-fbd9-42ba-985a-715bdc4178af-kube-api-access-jfbqs\") on node \"crc\" DevicePath \"\"" Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.462594 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8509f672-fbd9-42ba-985a-715bdc4178af-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.993885 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrthr" event={"ID":"8509f672-fbd9-42ba-985a-715bdc4178af","Type":"ContainerDied","Data":"a97722be63124e4460becf9b6e26fc8fec8f538e6037c8fd5d7314e586103219"} Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.993970 4748 scope.go:117] "RemoveContainer" containerID="73f82d9f688d6f964affca9e56bd95aa42399c236888b85104372515921da7ea" Mar 20 12:06:28 crc kubenswrapper[4748]: I0320 12:06:28.994011 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrthr" Mar 20 12:06:29 crc kubenswrapper[4748]: I0320 12:06:29.032578 4748 scope.go:117] "RemoveContainer" containerID="f6dfe4c6985f26c6283236192c75324f7fdb98755e1f4f9b98b29f4dd830f013" Mar 20 12:06:29 crc kubenswrapper[4748]: I0320 12:06:29.032806 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrthr"] Mar 20 12:06:29 crc kubenswrapper[4748]: I0320 12:06:29.055812 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrthr"] Mar 20 12:06:29 crc kubenswrapper[4748]: I0320 12:06:29.068243 4748 scope.go:117] "RemoveContainer" containerID="b7e9fe0aeb7d0331c3c27c1d5caa11e8df7fb66c1b88e237c659e0568b7f171a" Mar 20 12:06:29 crc kubenswrapper[4748]: I0320 12:06:29.533494 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8509f672-fbd9-42ba-985a-715bdc4178af" path="/var/lib/kubelet/pods/8509f672-fbd9-42ba-985a-715bdc4178af/volumes" Mar 20 12:06:35 crc kubenswrapper[4748]: I0320 12:06:35.579030 4748 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:35 crc kubenswrapper[4748]: I0320 12:06:35.654516 4748 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:36 crc kubenswrapper[4748]: I0320 12:06:36.515424 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" Mar 20 12:06:36 crc kubenswrapper[4748]: E0320 12:06:36.515885 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:06:38 crc kubenswrapper[4748]: I0320 12:06:38.217043 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-trpfs"] Mar 20 12:06:38 crc kubenswrapper[4748]: I0320 12:06:38.217320 4748 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-trpfs" podUID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerName="registry-server" containerID="cri-o://1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20" gracePeriod=2 Mar 20 12:06:38 crc kubenswrapper[4748]: I0320 12:06:38.748086 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:38 crc kubenswrapper[4748]: I0320 12:06:38.886007 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpp97\" (UniqueName: \"kubernetes.io/projected/fcc3f8d7-c68f-417c-925f-6b921abfd582-kube-api-access-fpp97\") pod \"fcc3f8d7-c68f-417c-925f-6b921abfd582\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " Mar 20 12:06:38 crc kubenswrapper[4748]: I0320 12:06:38.886381 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-catalog-content\") pod \"fcc3f8d7-c68f-417c-925f-6b921abfd582\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " Mar 20 12:06:38 crc kubenswrapper[4748]: I0320 12:06:38.886476 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-utilities\") pod \"fcc3f8d7-c68f-417c-925f-6b921abfd582\" (UID: \"fcc3f8d7-c68f-417c-925f-6b921abfd582\") " Mar 20 12:06:38 crc kubenswrapper[4748]: I0320 12:06:38.887223 4748 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-utilities" (OuterVolumeSpecName: "utilities") pod "fcc3f8d7-c68f-417c-925f-6b921abfd582" (UID: "fcc3f8d7-c68f-417c-925f-6b921abfd582"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:06:38 crc kubenswrapper[4748]: I0320 12:06:38.892541 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc3f8d7-c68f-417c-925f-6b921abfd582-kube-api-access-fpp97" (OuterVolumeSpecName: "kube-api-access-fpp97") pod "fcc3f8d7-c68f-417c-925f-6b921abfd582" (UID: "fcc3f8d7-c68f-417c-925f-6b921abfd582"). InnerVolumeSpecName "kube-api-access-fpp97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:06:38 crc kubenswrapper[4748]: I0320 12:06:38.988461 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpp97\" (UniqueName: \"kubernetes.io/projected/fcc3f8d7-c68f-417c-925f-6b921abfd582-kube-api-access-fpp97\") on node \"crc\" DevicePath \"\"" Mar 20 12:06:38 crc kubenswrapper[4748]: I0320 12:06:38.988492 4748 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.013357 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcc3f8d7-c68f-417c-925f-6b921abfd582" (UID: "fcc3f8d7-c68f-417c-925f-6b921abfd582"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.090530 4748 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcc3f8d7-c68f-417c-925f-6b921abfd582-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.102282 4748 generic.go:334] "Generic (PLEG): container finished" podID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerID="1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20" exitCode=0 Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.102328 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trpfs" event={"ID":"fcc3f8d7-c68f-417c-925f-6b921abfd582","Type":"ContainerDied","Data":"1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20"} Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.102361 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trpfs" event={"ID":"fcc3f8d7-c68f-417c-925f-6b921abfd582","Type":"ContainerDied","Data":"51335e256149e53b1605be0bda73611d51415780898fb4a44a930feec09ea211"} Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.102379 4748 scope.go:117] "RemoveContainer" containerID="1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.102484 4748 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-trpfs" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.131639 4748 scope.go:117] "RemoveContainer" containerID="83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.150363 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-trpfs"] Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.157899 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-trpfs"] Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.180528 4748 scope.go:117] "RemoveContainer" containerID="41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.210408 4748 scope.go:117] "RemoveContainer" containerID="1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20" Mar 20 12:06:39 crc kubenswrapper[4748]: E0320 12:06:39.210887 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20\": container with ID starting with 1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20 not found: ID does not exist" containerID="1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.210952 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20"} err="failed to get container status \"1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20\": rpc error: code = NotFound desc = could not find container \"1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20\": container with ID starting with 1a696cafd4698b05fc1c32734fa2c925fc59bb13b1049aabb3cde5cce4ef3c20 not found: ID does 
not exist" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.210985 4748 scope.go:117] "RemoveContainer" containerID="83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105" Mar 20 12:06:39 crc kubenswrapper[4748]: E0320 12:06:39.211389 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105\": container with ID starting with 83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105 not found: ID does not exist" containerID="83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.211421 4748 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105"} err="failed to get container status \"83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105\": rpc error: code = NotFound desc = could not find container \"83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105\": container with ID starting with 83675f47d7dfad97ca52d942cf12e99d6c59f995c2614edf5465d35b58b80105 not found: ID does not exist" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.211441 4748 scope.go:117] "RemoveContainer" containerID="41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82" Mar 20 12:06:39 crc kubenswrapper[4748]: E0320 12:06:39.211810 4748 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82\": container with ID starting with 41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82 not found: ID does not exist" containerID="41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.211854 4748 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82"} err="failed to get container status \"41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82\": rpc error: code = NotFound desc = could not find container \"41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82\": container with ID starting with 41ba6dc4d7805d4f8e2346d357da9f0d1bf6778841df5c7f903abf1afcabee82 not found: ID does not exist" Mar 20 12:06:39 crc kubenswrapper[4748]: I0320 12:06:39.537080 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc3f8d7-c68f-417c-925f-6b921abfd582" path="/var/lib/kubelet/pods/fcc3f8d7-c68f-417c-925f-6b921abfd582/volumes" Mar 20 12:06:51 crc kubenswrapper[4748]: I0320 12:06:51.515278 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" Mar 20 12:06:51 crc kubenswrapper[4748]: E0320 12:06:51.516467 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:07:06 crc kubenswrapper[4748]: I0320 12:07:06.515480 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" Mar 20 12:07:06 crc kubenswrapper[4748]: E0320 12:07:06.516504 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:07:21 crc kubenswrapper[4748]: I0320 12:07:21.515186 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" Mar 20 12:07:21 crc kubenswrapper[4748]: E0320 12:07:21.517505 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:07:35 crc kubenswrapper[4748]: I0320 12:07:35.524285 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" Mar 20 12:07:35 crc kubenswrapper[4748]: E0320 12:07:35.525168 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:07:49 crc kubenswrapper[4748]: I0320 12:07:49.515425 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f" Mar 20 12:07:49 crc kubenswrapper[4748]: E0320 12:07:49.516677 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.174170 4748 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566808-9w6f5"] Mar 20 12:08:00 crc kubenswrapper[4748]: E0320 12:08:00.175046 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerName="extract-content" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.175058 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerName="extract-content" Mar 20 12:08:00 crc kubenswrapper[4748]: E0320 12:08:00.175069 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8509f672-fbd9-42ba-985a-715bdc4178af" containerName="extract-utilities" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.175076 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8509f672-fbd9-42ba-985a-715bdc4178af" containerName="extract-utilities" Mar 20 12:08:00 crc kubenswrapper[4748]: E0320 12:08:00.175087 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerName="extract-utilities" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.175094 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerName="extract-utilities" Mar 20 12:08:00 crc kubenswrapper[4748]: E0320 12:08:00.175118 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8509f672-fbd9-42ba-985a-715bdc4178af" containerName="registry-server" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.175124 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8509f672-fbd9-42ba-985a-715bdc4178af" containerName="registry-server" Mar 20 
12:08:00 crc kubenswrapper[4748]: E0320 12:08:00.175135 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerName="registry-server" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.175141 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerName="registry-server" Mar 20 12:08:00 crc kubenswrapper[4748]: E0320 12:08:00.175152 4748 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8509f672-fbd9-42ba-985a-715bdc4178af" containerName="extract-content" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.175157 4748 state_mem.go:107] "Deleted CPUSet assignment" podUID="8509f672-fbd9-42ba-985a-715bdc4178af" containerName="extract-content" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.175308 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc3f8d7-c68f-417c-925f-6b921abfd582" containerName="registry-server" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.175320 4748 memory_manager.go:354] "RemoveStaleState removing state" podUID="8509f672-fbd9-42ba-985a-715bdc4178af" containerName="registry-server" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.175906 4748 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566808-9w6f5" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.179656 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.180000 4748 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.180197 4748 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-p6q8z" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.200087 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566808-9w6f5"] Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.325899 4748 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7fmc\" (UniqueName: \"kubernetes.io/projected/60479296-4487-4287-8862-70a9904800cc-kube-api-access-d7fmc\") pod \"auto-csr-approver-29566808-9w6f5\" (UID: \"60479296-4487-4287-8862-70a9904800cc\") " pod="openshift-infra/auto-csr-approver-29566808-9w6f5" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.428879 4748 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7fmc\" (UniqueName: \"kubernetes.io/projected/60479296-4487-4287-8862-70a9904800cc-kube-api-access-d7fmc\") pod \"auto-csr-approver-29566808-9w6f5\" (UID: \"60479296-4487-4287-8862-70a9904800cc\") " pod="openshift-infra/auto-csr-approver-29566808-9w6f5" Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.449515 4748 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7fmc\" (UniqueName: \"kubernetes.io/projected/60479296-4487-4287-8862-70a9904800cc-kube-api-access-d7fmc\") pod \"auto-csr-approver-29566808-9w6f5\" (UID: \"60479296-4487-4287-8862-70a9904800cc\") " 
pod="openshift-infra/auto-csr-approver-29566808-9w6f5"
Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.502316 4748 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566808-9w6f5"
Mar 20 12:08:00 crc kubenswrapper[4748]: I0320 12:08:00.966573 4748 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566808-9w6f5"]
Mar 20 12:08:00 crc kubenswrapper[4748]: W0320 12:08:00.975018 4748 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60479296_4487_4287_8862_70a9904800cc.slice/crio-8e9e71fa3bc5b28bbbaa8ea96d18e0e1ef18893826c3b9e83e0fb9dad4b87a85 WatchSource:0}: Error finding container 8e9e71fa3bc5b28bbbaa8ea96d18e0e1ef18893826c3b9e83e0fb9dad4b87a85: Status 404 returned error can't find the container with id 8e9e71fa3bc5b28bbbaa8ea96d18e0e1ef18893826c3b9e83e0fb9dad4b87a85
Mar 20 12:08:01 crc kubenswrapper[4748]: I0320 12:08:01.516827 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"
Mar 20 12:08:01 crc kubenswrapper[4748]: E0320 12:08:01.517630 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"
Mar 20 12:08:01 crc kubenswrapper[4748]: I0320 12:08:01.794356 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566808-9w6f5" event={"ID":"60479296-4487-4287-8862-70a9904800cc","Type":"ContainerStarted","Data":"8e9e71fa3bc5b28bbbaa8ea96d18e0e1ef18893826c3b9e83e0fb9dad4b87a85"}
Mar 20 12:08:03 crc kubenswrapper[4748]: I0320 12:08:03.820172 4748 generic.go:334] "Generic (PLEG): container finished" podID="60479296-4487-4287-8862-70a9904800cc" containerID="484b887f2da87301ffcf743189a75390fcd225603e7d77bac3a58c82f0193eeb" exitCode=0
Mar 20 12:08:03 crc kubenswrapper[4748]: I0320 12:08:03.820267 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566808-9w6f5" event={"ID":"60479296-4487-4287-8862-70a9904800cc","Type":"ContainerDied","Data":"484b887f2da87301ffcf743189a75390fcd225603e7d77bac3a58c82f0193eeb"}
Mar 20 12:08:05 crc kubenswrapper[4748]: I0320 12:08:05.187010 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566808-9w6f5"
Mar 20 12:08:05 crc kubenswrapper[4748]: I0320 12:08:05.327150 4748 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7fmc\" (UniqueName: \"kubernetes.io/projected/60479296-4487-4287-8862-70a9904800cc-kube-api-access-d7fmc\") pod \"60479296-4487-4287-8862-70a9904800cc\" (UID: \"60479296-4487-4287-8862-70a9904800cc\") "
Mar 20 12:08:05 crc kubenswrapper[4748]: I0320 12:08:05.333532 4748 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60479296-4487-4287-8862-70a9904800cc-kube-api-access-d7fmc" (OuterVolumeSpecName: "kube-api-access-d7fmc") pod "60479296-4487-4287-8862-70a9904800cc" (UID: "60479296-4487-4287-8862-70a9904800cc"). InnerVolumeSpecName "kube-api-access-d7fmc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 12:08:05 crc kubenswrapper[4748]: I0320 12:08:05.429994 4748 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7fmc\" (UniqueName: \"kubernetes.io/projected/60479296-4487-4287-8862-70a9904800cc-kube-api-access-d7fmc\") on node \"crc\" DevicePath \"\""
Mar 20 12:08:05 crc kubenswrapper[4748]: I0320 12:08:05.844110 4748 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566808-9w6f5" event={"ID":"60479296-4487-4287-8862-70a9904800cc","Type":"ContainerDied","Data":"8e9e71fa3bc5b28bbbaa8ea96d18e0e1ef18893826c3b9e83e0fb9dad4b87a85"}
Mar 20 12:08:05 crc kubenswrapper[4748]: I0320 12:08:05.844149 4748 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e9e71fa3bc5b28bbbaa8ea96d18e0e1ef18893826c3b9e83e0fb9dad4b87a85"
Mar 20 12:08:05 crc kubenswrapper[4748]: I0320 12:08:05.844233 4748 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566808-9w6f5"
Mar 20 12:08:06 crc kubenswrapper[4748]: I0320 12:08:06.283028 4748 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566802-hvkxv"]
Mar 20 12:08:06 crc kubenswrapper[4748]: I0320 12:08:06.293618 4748 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566802-hvkxv"]
Mar 20 12:08:07 crc kubenswrapper[4748]: I0320 12:08:07.526476 4748 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf6c021-b677-4fc2-8fa4-a4c86022bea9" path="/var/lib/kubelet/pods/8bf6c021-b677-4fc2-8fa4-a4c86022bea9/volumes"
Mar 20 12:08:12 crc kubenswrapper[4748]: I0320 12:08:12.516176 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"
Mar 20 12:08:12 crc kubenswrapper[4748]: E0320 12:08:12.517400 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"
Mar 20 12:08:25 crc kubenswrapper[4748]: I0320 12:08:25.818017 4748 scope.go:117] "RemoveContainer" containerID="7893591d9717acc313f255cd2b4dad137ea10871e51c3328634fc593b9c03d7c"
Mar 20 12:08:26 crc kubenswrapper[4748]: I0320 12:08:26.516903 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"
Mar 20 12:08:26 crc kubenswrapper[4748]: E0320 12:08:26.517738 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"
Mar 20 12:08:39 crc kubenswrapper[4748]: I0320 12:08:39.515590 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"
Mar 20 12:08:39 crc kubenswrapper[4748]: E0320 12:08:39.516380 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"
Mar 20 12:08:53 crc kubenswrapper[4748]: I0320 12:08:53.515986 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"
Mar 20 12:08:53 crc kubenswrapper[4748]: E0320 12:08:53.516887 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"
Mar 20 12:09:07 crc kubenswrapper[4748]: I0320 12:09:07.520656 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"
Mar 20 12:09:07 crc kubenswrapper[4748]: E0320 12:09:07.527478 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"
Mar 20 12:09:18 crc kubenswrapper[4748]: I0320 12:09:18.515748 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"
Mar 20 12:09:18 crc kubenswrapper[4748]: E0320 12:09:18.518161 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"
Mar 20 12:09:29 crc kubenswrapper[4748]: I0320 12:09:29.520092 4748 scope.go:117] "RemoveContainer" containerID="ab5fbc3d3e1bf68c7ddafde8663631f03f98d3c2bfc6679861a776522b57bd0f"
Mar 20 12:09:29 crc kubenswrapper[4748]: E0320 12:09:29.525541 4748 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5lbvz_openshift-machine-config-operator(8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c)\"" pod="openshift-machine-config-operator/machine-config-daemon-5lbvz" podUID="8e81ab84-9a9e-4ec4-ae87-ec51a8bc658c"